Dec 05 05:22:02 localhost kernel: Linux version 5.14.0-645.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-68.el9) #1 SMP PREEMPT_DYNAMIC Fri Nov 28 14:01:17 UTC 2025
Dec 05 05:22:02 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Dec 05 05:22:02 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64 root=UUID=fcf6b761-831a-48a7-9f5f-068b5063763f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec 05 05:22:02 localhost kernel: BIOS-provided physical RAM map:
Dec 05 05:22:02 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Dec 05 05:22:02 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Dec 05 05:22:02 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Dec 05 05:22:02 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ffdafff] usable
Dec 05 05:22:02 localhost kernel: BIOS-e820: [mem 0x000000007ffdb000-0x000000007fffffff] reserved
Dec 05 05:22:02 localhost kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Dec 05 05:22:02 localhost kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Dec 05 05:22:02 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Dec 05 05:22:02 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Dec 05 05:22:02 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000027fffffff] usable
Dec 05 05:22:02 localhost kernel: NX (Execute Disable) protection: active
Dec 05 05:22:02 localhost kernel: APIC: Static calls initialized
Dec 05 05:22:02 localhost kernel: SMBIOS 2.8 present.
Dec 05 05:22:02 localhost kernel: DMI: Red Hat OpenStack Compute/RHEL, BIOS 1.16.1-1.el9 04/01/2014
Dec 05 05:22:02 localhost kernel: Hypervisor detected: KVM
Dec 05 05:22:02 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Dec 05 05:22:02 localhost kernel: kvm-clock: using sched offset of 3304755336 cycles
Dec 05 05:22:02 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Dec 05 05:22:02 localhost kernel: tsc: Detected 2445.404 MHz processor
Dec 05 05:22:02 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Dec 05 05:22:02 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Dec 05 05:22:02 localhost kernel: last_pfn = 0x280000 max_arch_pfn = 0x400000000
Dec 05 05:22:02 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Dec 05 05:22:02 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Dec 05 05:22:02 localhost kernel: last_pfn = 0x7ffdb max_arch_pfn = 0x400000000
Dec 05 05:22:02 localhost kernel: found SMP MP-table at [mem 0x000f5b60-0x000f5b6f]
Dec 05 05:22:02 localhost kernel: Using GB pages for direct mapping
Dec 05 05:22:02 localhost kernel: RAMDISK: [mem 0x2d472000-0x32a30fff]
Dec 05 05:22:02 localhost kernel: ACPI: Early table checksum verification disabled
Dec 05 05:22:02 localhost kernel: ACPI: RSDP 0x00000000000F5B20 000014 (v00 BOCHS )
Dec 05 05:22:02 localhost kernel: ACPI: RSDT 0x000000007FFE35EB 000034 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 05 05:22:02 localhost kernel: ACPI: FACP 0x000000007FFE3403 0000F4 (v03 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 05 05:22:02 localhost kernel: ACPI: DSDT 0x000000007FFDFCC0 003743 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 05 05:22:02 localhost kernel: ACPI: FACS 0x000000007FFDFC80 000040
Dec 05 05:22:02 localhost kernel: ACPI: APIC 0x000000007FFE34F7 000090 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 05 05:22:02 localhost kernel: ACPI: MCFG 0x000000007FFE3587 00003C (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 05 05:22:02 localhost kernel: ACPI: WAET 0x000000007FFE35C3 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 05 05:22:02 localhost kernel: ACPI: Reserving FACP table memory at [mem 0x7ffe3403-0x7ffe34f6]
Dec 05 05:22:02 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0x7ffdfcc0-0x7ffe3402]
Dec 05 05:22:02 localhost kernel: ACPI: Reserving FACS table memory at [mem 0x7ffdfc80-0x7ffdfcbf]
Dec 05 05:22:02 localhost kernel: ACPI: Reserving APIC table memory at [mem 0x7ffe34f7-0x7ffe3586]
Dec 05 05:22:02 localhost kernel: ACPI: Reserving MCFG table memory at [mem 0x7ffe3587-0x7ffe35c2]
Dec 05 05:22:02 localhost kernel: ACPI: Reserving WAET table memory at [mem 0x7ffe35c3-0x7ffe35ea]
Dec 05 05:22:02 localhost kernel: No NUMA configuration found
Dec 05 05:22:02 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000027fffffff]
Dec 05 05:22:02 localhost kernel: NODE_DATA(0) allocated [mem 0x27ffd5000-0x27fffffff]
Dec 05 05:22:02 localhost kernel: crashkernel reserved: 0x000000006f000000 - 0x000000007f000000 (256 MB)
Dec 05 05:22:02 localhost kernel: Zone ranges:
Dec 05 05:22:02 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Dec 05 05:22:02 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Dec 05 05:22:02 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000027fffffff]
Dec 05 05:22:02 localhost kernel:   Device   empty
Dec 05 05:22:02 localhost kernel: Movable zone start for each node
Dec 05 05:22:02 localhost kernel: Early memory node ranges
Dec 05 05:22:02 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Dec 05 05:22:02 localhost kernel:   node   0: [mem 0x0000000000100000-0x000000007ffdafff]
Dec 05 05:22:02 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000027fffffff]
Dec 05 05:22:02 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000027fffffff]
Dec 05 05:22:02 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Dec 05 05:22:02 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Dec 05 05:22:02 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Dec 05 05:22:02 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Dec 05 05:22:02 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Dec 05 05:22:02 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Dec 05 05:22:02 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Dec 05 05:22:02 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Dec 05 05:22:02 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Dec 05 05:22:02 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Dec 05 05:22:02 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Dec 05 05:22:02 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Dec 05 05:22:02 localhost kernel: TSC deadline timer available
Dec 05 05:22:02 localhost kernel: CPU topo: Max. logical packages:   4
Dec 05 05:22:02 localhost kernel: CPU topo: Max. logical dies:       4
Dec 05 05:22:02 localhost kernel: CPU topo: Max. dies per package:   1
Dec 05 05:22:02 localhost kernel: CPU topo: Max. threads per core:   1
Dec 05 05:22:02 localhost kernel: CPU topo: Num. cores per package:     1
Dec 05 05:22:02 localhost kernel: CPU topo: Num. threads per package:   1
Dec 05 05:22:02 localhost kernel: CPU topo: Allowing 4 present CPUs plus 0 hotplug CPUs
Dec 05 05:22:02 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Dec 05 05:22:02 localhost kernel: kvm-guest: KVM setup pv remote TLB flush
Dec 05 05:22:02 localhost kernel: kvm-guest: setup PV sched yield
Dec 05 05:22:02 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Dec 05 05:22:02 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Dec 05 05:22:02 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Dec 05 05:22:02 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Dec 05 05:22:02 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x7ffdb000-0x7fffffff]
Dec 05 05:22:02 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x80000000-0xafffffff]
Dec 05 05:22:02 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xb0000000-0xbfffffff]
Dec 05 05:22:02 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfed1bfff]
Dec 05 05:22:02 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfed1c000-0xfed1ffff]
Dec 05 05:22:02 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfed20000-0xfeffbfff]
Dec 05 05:22:02 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Dec 05 05:22:02 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Dec 05 05:22:02 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Dec 05 05:22:02 localhost kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Dec 05 05:22:02 localhost kernel: Booting paravirtualized kernel on KVM
Dec 05 05:22:02 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Dec 05 05:22:02 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:4 nr_cpu_ids:4 nr_node_ids:1
Dec 05 05:22:02 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u524288
Dec 05 05:22:02 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u524288 alloc=1*2097152
Dec 05 05:22:02 localhost kernel: pcpu-alloc: [0] 0 1 2 3 
Dec 05 05:22:02 localhost kernel: kvm-guest: PV spinlocks enabled
Dec 05 05:22:02 localhost kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Dec 05 05:22:02 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64 root=UUID=fcf6b761-831a-48a7-9f5f-068b5063763f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec 05 05:22:02 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64", will be passed to user space.
Dec 05 05:22:02 localhost kernel: random: crng init done
Dec 05 05:22:02 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Dec 05 05:22:02 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Dec 05 05:22:02 localhost kernel: Fallback order for Node 0: 0 
Dec 05 05:22:02 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Dec 05 05:22:02 localhost kernel: Policy zone: Normal
Dec 05 05:22:02 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec 05 05:22:02 localhost kernel: software IO TLB: area num 4.
Dec 05 05:22:02 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Dec 05 05:22:02 localhost kernel: ftrace: allocating 49335 entries in 193 pages
Dec 05 05:22:02 localhost kernel: ftrace: allocated 193 pages with 3 groups
Dec 05 05:22:02 localhost kernel: Dynamic Preempt: voluntary
Dec 05 05:22:02 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Dec 05 05:22:02 localhost kernel: rcu:         RCU event tracing is enabled.
Dec 05 05:22:02 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=4.
Dec 05 05:22:02 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Dec 05 05:22:02 localhost kernel:         Rude variant of Tasks RCU enabled.
Dec 05 05:22:02 localhost kernel:         Tracing variant of Tasks RCU enabled.
Dec 05 05:22:02 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec 05 05:22:02 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Dec 05 05:22:02 localhost kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Dec 05 05:22:02 localhost kernel: RCU Tasks Rude: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Dec 05 05:22:02 localhost kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Dec 05 05:22:02 localhost kernel: NR_IRQS: 524544, nr_irqs: 456, preallocated irqs: 16
Dec 05 05:22:02 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Dec 05 05:22:02 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Dec 05 05:22:02 localhost kernel: Console: colour VGA+ 80x25
Dec 05 05:22:02 localhost kernel: printk: console [ttyS0] enabled
Dec 05 05:22:02 localhost kernel: ACPI: Core revision 20230331
Dec 05 05:22:02 localhost kernel: APIC: Switch to symmetric I/O mode setup
Dec 05 05:22:02 localhost kernel: x2apic enabled
Dec 05 05:22:02 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Dec 05 05:22:02 localhost kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask()
Dec 05 05:22:02 localhost kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself()
Dec 05 05:22:02 localhost kernel: kvm-guest: setup PV IPIs
Dec 05 05:22:02 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Dec 05 05:22:02 localhost kernel: Calibrating delay loop (skipped) preset value.. 4890.80 BogoMIPS (lpj=2445404)
Dec 05 05:22:02 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Dec 05 05:22:02 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Dec 05 05:22:02 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Dec 05 05:22:02 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Dec 05 05:22:02 localhost kernel: Spectre V2 : Mitigation: Retpolines
Dec 05 05:22:02 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Dec 05 05:22:02 localhost kernel: Spectre V2 : Enabling Restricted Speculation for firmware calls
Dec 05 05:22:02 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Dec 05 05:22:02 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Dec 05 05:22:02 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Dec 05 05:22:02 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Dec 05 05:22:02 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Dec 05 05:22:02 localhost kernel: Transient Scheduler Attacks: Vulnerable: No microcode
Dec 05 05:22:02 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Dec 05 05:22:02 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Dec 05 05:22:02 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Dec 05 05:22:02 localhost kernel: x86/fpu: Supporting XSAVE feature 0x200: 'Protection Keys User registers'
Dec 05 05:22:02 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Dec 05 05:22:02 localhost kernel: x86/fpu: xstate_offset[9]:  832, xstate_sizes[9]:    8
Dec 05 05:22:02 localhost kernel: x86/fpu: Enabled xstate features 0x207, context size is 840 bytes, using 'compacted' format.
Dec 05 05:22:02 localhost kernel: Freeing SMP alternatives memory: 40K
Dec 05 05:22:02 localhost kernel: pid_max: default: 32768 minimum: 301
Dec 05 05:22:02 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Dec 05 05:22:02 localhost kernel: landlock: Up and running.
Dec 05 05:22:02 localhost kernel: Yama: becoming mindful.
Dec 05 05:22:02 localhost kernel: SELinux:  Initializing.
Dec 05 05:22:02 localhost kernel: LSM support for eBPF active
Dec 05 05:22:02 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec 05 05:22:02 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec 05 05:22:02 localhost kernel: smpboot: CPU0: AMD EPYC 7763 64-Core Processor (family: 0x19, model: 0x1, stepping: 0x1)
Dec 05 05:22:02 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Dec 05 05:22:02 localhost kernel: ... version:                0
Dec 05 05:22:02 localhost kernel: ... bit width:              48
Dec 05 05:22:02 localhost kernel: ... generic registers:      6
Dec 05 05:22:02 localhost kernel: ... value mask:             0000ffffffffffff
Dec 05 05:22:02 localhost kernel: ... max period:             00007fffffffffff
Dec 05 05:22:02 localhost kernel: ... fixed-purpose events:   0
Dec 05 05:22:02 localhost kernel: ... event mask:             000000000000003f
Dec 05 05:22:02 localhost kernel: signal: max sigframe size: 3376
Dec 05 05:22:02 localhost kernel: rcu: Hierarchical SRCU implementation.
Dec 05 05:22:02 localhost kernel: rcu:         Max phase no-delay instances is 400.
Dec 05 05:22:02 localhost kernel: smp: Bringing up secondary CPUs ...
Dec 05 05:22:02 localhost kernel: smpboot: x86: Booting SMP configuration:
Dec 05 05:22:02 localhost kernel: .... node  #0, CPUs:      #1 #2 #3
Dec 05 05:22:02 localhost kernel: smp: Brought up 1 node, 4 CPUs
Dec 05 05:22:02 localhost kernel: smpboot: Total of 4 processors activated (19563.23 BogoMIPS)
Dec 05 05:22:02 localhost kernel: node 0 deferred pages initialised in 9ms
Dec 05 05:22:02 localhost kernel: Memory: 7766176K/8388068K available (16384K kernel code, 5795K rwdata, 13908K rodata, 4196K init, 7156K bss, 617156K reserved, 0K cma-reserved)
Dec 05 05:22:02 localhost kernel: devtmpfs: initialized
Dec 05 05:22:02 localhost kernel: x86/mm: Memory block size: 128MB
Dec 05 05:22:02 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec 05 05:22:02 localhost kernel: futex hash table entries: 1024 (65536 bytes on 1 NUMA nodes, total 64 KiB, linear).
Dec 05 05:22:02 localhost kernel: pinctrl core: initialized pinctrl subsystem
Dec 05 05:22:02 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec 05 05:22:02 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Dec 05 05:22:02 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Dec 05 05:22:02 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Dec 05 05:22:02 localhost kernel: audit: initializing netlink subsys (disabled)
Dec 05 05:22:02 localhost kernel: audit: type=2000 audit(1764912122.148:1): state=initialized audit_enabled=0 res=1
Dec 05 05:22:02 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Dec 05 05:22:02 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec 05 05:22:02 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Dec 05 05:22:02 localhost kernel: cpuidle: using governor menu
Dec 05 05:22:02 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec 05 05:22:02 localhost kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] (base 0xb0000000) for domain 0000 [bus 00-ff]
Dec 05 05:22:02 localhost kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] reserved as E820 entry
Dec 05 05:22:02 localhost kernel: PCI: Using configuration type 1 for base access
Dec 05 05:22:02 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Dec 05 05:22:02 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Dec 05 05:22:02 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Dec 05 05:22:02 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Dec 05 05:22:02 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Dec 05 05:22:02 localhost kernel: Demotion targets for Node 0: null
Dec 05 05:22:02 localhost kernel: cryptd: max_cpu_qlen set to 1000
Dec 05 05:22:02 localhost kernel: ACPI: Added _OSI(Module Device)
Dec 05 05:22:02 localhost kernel: ACPI: Added _OSI(Processor Device)
Dec 05 05:22:02 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Dec 05 05:22:02 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec 05 05:22:02 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec 05 05:22:02 localhost kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Dec 05 05:22:02 localhost kernel: ACPI: Interpreter enabled
Dec 05 05:22:02 localhost kernel: ACPI: PM: (supports S0 S5)
Dec 05 05:22:02 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Dec 05 05:22:02 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Dec 05 05:22:02 localhost kernel: PCI: Using E820 reservations for host bridge windows
Dec 05 05:22:02 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Dec 05 05:22:02 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Dec 05 05:22:02 localhost kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Dec 05 05:22:02 localhost kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR DPC]
Dec 05 05:22:02 localhost kernel: acpi PNP0A08:00: _OSC: OS now controls [SHPCHotplug PME AER PCIeCapability]
Dec 05 05:22:02 localhost kernel: PCI host bridge to bus 0000:00
Dec 05 05:22:02 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xafffffff window]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x280000000-0xa7fffffff window]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
Dec 05 05:22:02 localhost kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Dec 05 05:22:02 localhost kernel: pci 0000:00:01.0: BAR 0 [mem 0xf9800000-0xf9ffffff pref]
Dec 05 05:22:02 localhost kernel: pci 0000:00:01.0: BAR 2 [mem 0xfc200000-0xfc203fff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci 0000:00:01.0: BAR 4 [mem 0xfea10000-0xfea10fff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:01.0: ROM [mem 0xfea00000-0xfea0ffff pref]
Dec 05 05:22:02 localhost kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfea11000-0xfea11fff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.0:   bridge window [io  0xc000-0xcfff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.0:   bridge window [mem 0xfc600000-0xfc9fffff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.0:   bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.1: BAR 0 [mem 0xfea12000-0xfea12fff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.1:   bridge window [mem 0xfe800000-0xfe9fffff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.1:   bridge window [mem 0xfbe00000-0xfbffffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.2: BAR 0 [mem 0xfea13000-0xfea13fff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.2:   bridge window [mem 0xfe600000-0xfe7fffff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.2:   bridge window [mem 0xfbc00000-0xfbdfffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.3: BAR 0 [mem 0xfea14000-0xfea14fff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.3:   bridge window [mem 0xfe400000-0xfe5fffff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.3:   bridge window [mem 0xfba00000-0xfbbfffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.4: BAR 0 [mem 0xfea15000-0xfea15fff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.4:   bridge window [mem 0xfe200000-0xfe3fffff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.4:   bridge window [mem 0xfb800000-0xfb9fffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.5: BAR 0 [mem 0xfea16000-0xfea16fff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.5:   bridge window [mem 0xfe000000-0xfe1fffff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.5:   bridge window [mem 0xfb600000-0xfb7fffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.6: BAR 0 [mem 0xfea17000-0xfea17fff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.6:   bridge window [mem 0xfde00000-0xfdffffff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.6:   bridge window [mem 0xfb400000-0xfb5fffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.7: BAR 0 [mem 0xfea18000-0xfea18fff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.7:   bridge window [mem 0xfdc00000-0xfddfffff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.7:   bridge window [mem 0xfb200000-0xfb3fffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.0: BAR 0 [mem 0xfea19000-0xfea19fff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.0: PCI bridge to [bus 0a]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.0:   bridge window [mem 0xfda00000-0xfdbfffff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.0:   bridge window [mem 0xfb000000-0xfb1fffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.1: BAR 0 [mem 0xfea1a000-0xfea1afff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.1: PCI bridge to [bus 0b]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.1:   bridge window [mem 0xfd800000-0xfd9fffff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.1:   bridge window [mem 0xfae00000-0xfaffffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.2: BAR 0 [mem 0xfea1b000-0xfea1bfff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.2: PCI bridge to [bus 0c]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.2:   bridge window [mem 0xfd600000-0xfd7fffff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.2:   bridge window [mem 0xfac00000-0xfadfffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.3: BAR 0 [mem 0xfea1c000-0xfea1cfff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.3: PCI bridge to [bus 0d]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.3:   bridge window [mem 0xfd400000-0xfd5fffff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.3:   bridge window [mem 0xfaa00000-0xfabfffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.4: BAR 0 [mem 0xfea1d000-0xfea1dfff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.4: PCI bridge to [bus 0e]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.4:   bridge window [mem 0xfd200000-0xfd3fffff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.4:   bridge window [mem 0xfa800000-0xfa9fffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.5: BAR 0 [mem 0xfea1e000-0xfea1efff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.5: PCI bridge to [bus 0f]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.5:   bridge window [mem 0xfd000000-0xfd1fffff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.5:   bridge window [mem 0xfa600000-0xfa7fffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.6: BAR 0 [mem 0xfea1f000-0xfea1ffff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.6: PCI bridge to [bus 10]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.6:   bridge window [mem 0xfce00000-0xfcffffff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.6:   bridge window [mem 0xfa400000-0xfa5fffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.7: BAR 0 [mem 0xfea20000-0xfea20fff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.7: PCI bridge to [bus 11]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.7:   bridge window [mem 0xfcc00000-0xfcdfffff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.7:   bridge window [mem 0xfa200000-0xfa3fffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci 0000:00:04.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 05 05:22:02 localhost kernel: pci 0000:00:04.0: BAR 0 [mem 0xfea21000-0xfea21fff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:04.0: PCI bridge to [bus 12]
Dec 05 05:22:02 localhost kernel: pci 0000:00:04.0:   bridge window [mem 0xfca00000-0xfcbfffff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:04.0:   bridge window [mem 0xfa000000-0xfa1fffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint
Dec 05 05:22:02 localhost kernel: pci 0000:00:1f.0: quirk: [io  0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Dec 05 05:22:02 localhost kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint
Dec 05 05:22:02 localhost kernel: pci 0000:00:1f.2: BAR 4 [io  0xd040-0xd05f]
Dec 05 05:22:02 localhost kernel: pci 0000:00:1f.2: BAR 5 [mem 0xfea22000-0xfea22fff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint
Dec 05 05:22:02 localhost kernel: pci 0000:00:1f.3: BAR 4 [io  0x0700-0x073f]
Dec 05 05:22:02 localhost kernel: pci 0000:01:00.0: [1b36:000e] type 01 class 0x060400 PCIe to PCI/PCI-X bridge
Dec 05 05:22:02 localhost kernel: pci 0000:01:00.0: BAR 0 [mem 0xfc800000-0xfc8000ff 64bit]
Dec 05 05:22:02 localhost kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Dec 05 05:22:02 localhost kernel: pci 0000:01:00.0:   bridge window [io  0xc000-0xcfff]
Dec 05 05:22:02 localhost kernel: pci 0000:01:00.0:   bridge window [mem 0xfc600000-0xfc7fffff]
Dec 05 05:22:02 localhost kernel: pci 0000:01:00.0:   bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:02: extended config space not accessible
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [0] registered
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [1] registered
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [2] registered
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [3] registered
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [4] registered
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [5] registered
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [6] registered
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [7] registered
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [8] registered
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [9] registered
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [10] registered
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [11] registered
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [12] registered
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [13] registered
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [14] registered
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [15] registered
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [16] registered
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [17] registered
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [18] registered
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [19] registered
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [20] registered
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [21] registered
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [22] registered
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [23] registered
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [24] registered
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [25] registered
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [26] registered
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [27] registered
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [28] registered
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [29] registered
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [30] registered
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [31] registered
Dec 05 05:22:02 localhost kernel: pci 0000:02:01.0: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Dec 05 05:22:02 localhost kernel: pci 0000:02:01.0: BAR 4 [io  0xc000-0xc01f]
Dec 05 05:22:02 localhost kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [0-2] registered
Dec 05 05:22:02 localhost kernel: pci 0000:03:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Dec 05 05:22:02 localhost kernel: pci 0000:03:00.0: BAR 1 [mem 0xfe840000-0xfe840fff]
Dec 05 05:22:02 localhost kernel: pci 0000:03:00.0: BAR 4 [mem 0xfbe00000-0xfbe03fff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci 0000:03:00.0: ROM [mem 0xfe800000-0xfe83ffff pref]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [0-3] registered
Dec 05 05:22:02 localhost kernel: pci 0000:04:00.0: [1af4:1042] type 00 class 0x010000 PCIe Endpoint
Dec 05 05:22:02 localhost kernel: pci 0000:04:00.0: BAR 1 [mem 0xfe600000-0xfe600fff]
Dec 05 05:22:02 localhost kernel: pci 0000:04:00.0: BAR 4 [mem 0xfbc00000-0xfbc03fff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [0-4] registered
Dec 05 05:22:02 localhost kernel: pci 0000:05:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Dec 05 05:22:02 localhost kernel: pci 0000:05:00.0: BAR 4 [mem 0xfba00000-0xfba03fff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [0-5] registered
Dec 05 05:22:02 localhost kernel: pci 0000:06:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Dec 05 05:22:02 localhost kernel: pci 0000:06:00.0: BAR 4 [mem 0xfb800000-0xfb803fff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [0-6] registered
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [0-7] registered
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [0-8] registered
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [0-9] registered
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.0: PCI bridge to [bus 0a]
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [0-10] registered
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.1: PCI bridge to [bus 0b]
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [0-11] registered
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.2: PCI bridge to [bus 0c]
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [0-12] registered
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.3: PCI bridge to [bus 0d]
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [0-13] registered
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.4: PCI bridge to [bus 0e]
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [0-14] registered
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.5: PCI bridge to [bus 0f]
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [0-15] registered
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.6: PCI bridge to [bus 10]
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [0-16] registered
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.7: PCI bridge to [bus 11]
Dec 05 05:22:02 localhost kernel: acpiphp: Slot [0-17] registered
Dec 05 05:22:02 localhost kernel: pci 0000:00:04.0: PCI bridge to [bus 12]
Dec 05 05:22:02 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Dec 05 05:22:02 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Dec 05 05:22:02 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Dec 05 05:22:02 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Dec 05 05:22:02 localhost kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Dec 05 05:22:02 localhost kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Dec 05 05:22:02 localhost kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Dec 05 05:22:02 localhost kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Dec 05 05:22:02 localhost kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Dec 05 05:22:02 localhost kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Dec 05 05:22:02 localhost kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Dec 05 05:22:02 localhost kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Dec 05 05:22:02 localhost kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Dec 05 05:22:02 localhost kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Dec 05 05:22:02 localhost kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Dec 05 05:22:02 localhost kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Dec 05 05:22:02 localhost kernel: iommu: Default domain type: Translated
Dec 05 05:22:02 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Dec 05 05:22:02 localhost kernel: SCSI subsystem initialized
Dec 05 05:22:02 localhost kernel: ACPI: bus type USB registered
Dec 05 05:22:02 localhost kernel: usbcore: registered new interface driver usbfs
Dec 05 05:22:02 localhost kernel: usbcore: registered new interface driver hub
Dec 05 05:22:02 localhost kernel: usbcore: registered new device driver usb
Dec 05 05:22:02 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Dec 05 05:22:02 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Dec 05 05:22:02 localhost kernel: PTP clock support registered
Dec 05 05:22:02 localhost kernel: EDAC MC: Ver: 3.0.0
Dec 05 05:22:02 localhost kernel: NetLabel: Initializing
Dec 05 05:22:02 localhost kernel: NetLabel:  domain hash size = 128
Dec 05 05:22:02 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Dec 05 05:22:02 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Dec 05 05:22:02 localhost kernel: PCI: Using ACPI for IRQ routing
Dec 05 05:22:02 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Dec 05 05:22:02 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Dec 05 05:22:02 localhost kernel: e820: reserve RAM buffer [mem 0x7ffdb000-0x7fffffff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Dec 05 05:22:02 localhost kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Dec 05 05:22:02 localhost kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Dec 05 05:22:02 localhost kernel: vgaarb: loaded
Dec 05 05:22:02 localhost kernel: clocksource: Switched to clocksource kvm-clock
Dec 05 05:22:02 localhost kernel: VFS: Disk quotas dquot_6.6.0
Dec 05 05:22:02 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Dec 05 05:22:02 localhost kernel: pnp: PnP ACPI init
Dec 05 05:22:02 localhost kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved
Dec 05 05:22:02 localhost kernel: pnp: PnP ACPI: found 5 devices
Dec 05 05:22:02 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Dec 05 05:22:02 localhost kernel: NET: Registered PF_INET protocol family
Dec 05 05:22:02 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Dec 05 05:22:02 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Dec 05 05:22:02 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Dec 05 05:22:02 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Dec 05 05:22:02 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Dec 05 05:22:02 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Dec 05 05:22:02 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Dec 05 05:22:02 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Dec 05 05:22:02 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Dec 05 05:22:02 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Dec 05 05:22:02 localhost kernel: NET: Registered PF_XDP protocol family
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.1: bridge window [io  0x1000-0x0fff] to [bus 03] add_size 1000
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.2: bridge window [io  0x1000-0x0fff] to [bus 04] add_size 1000
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.3: bridge window [io  0x1000-0x0fff] to [bus 05] add_size 1000
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.4: bridge window [io  0x1000-0x0fff] to [bus 06] add_size 1000
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.5: bridge window [io  0x1000-0x0fff] to [bus 07] add_size 1000
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.6: bridge window [io  0x1000-0x0fff] to [bus 08] add_size 1000
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.7: bridge window [io  0x1000-0x0fff] to [bus 09] add_size 1000
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.0: bridge window [io  0x1000-0x0fff] to [bus 0a] add_size 1000
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.1: bridge window [io  0x1000-0x0fff] to [bus 0b] add_size 1000
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.2: bridge window [io  0x1000-0x0fff] to [bus 0c] add_size 1000
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.3: bridge window [io  0x1000-0x0fff] to [bus 0d] add_size 1000
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.4: bridge window [io  0x1000-0x0fff] to [bus 0e] add_size 1000
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.5: bridge window [io  0x1000-0x0fff] to [bus 0f] add_size 1000
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.6: bridge window [io  0x1000-0x0fff] to [bus 10] add_size 1000
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.7: bridge window [io  0x1000-0x0fff] to [bus 11] add_size 1000
Dec 05 05:22:02 localhost kernel: pci 0000:00:04.0: bridge window [io  0x1000-0x0fff] to [bus 12] add_size 1000
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.1: bridge window [io  0x1000-0x1fff]: assigned
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.2: bridge window [io  0x2000-0x2fff]: assigned
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.3: bridge window [io  0x3000-0x3fff]: assigned
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.4: bridge window [io  0x4000-0x4fff]: assigned
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.5: bridge window [io  0x5000-0x5fff]: assigned
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.6: bridge window [io  0x6000-0x6fff]: assigned
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.7: bridge window [io  0x7000-0x7fff]: assigned
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.0: bridge window [io  0x8000-0x8fff]: assigned
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.1: bridge window [io  0x9000-0x9fff]: assigned
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.2: bridge window [io  0xa000-0xafff]: assigned
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.3: bridge window [io  0xb000-0xbfff]: assigned
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.4: bridge window [io  0xe000-0xefff]: assigned
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.5: bridge window [io  0xf000-0xffff]: assigned
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.6: bridge window [io  size 0x1000]: can't assign; no space
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.6: bridge window [io  size 0x1000]: failed to assign
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.7: bridge window [io  size 0x1000]: can't assign; no space
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.7: bridge window [io  size 0x1000]: failed to assign
Dec 05 05:22:02 localhost kernel: pci 0000:00:04.0: bridge window [io  size 0x1000]: can't assign; no space
Dec 05 05:22:02 localhost kernel: pci 0000:00:04.0: bridge window [io  size 0x1000]: failed to assign
Dec 05 05:22:02 localhost kernel: pci 0000:00:04.0: bridge window [io  0x1000-0x1fff]: assigned
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.7: bridge window [io  0x2000-0x2fff]: assigned
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.6: bridge window [io  0x3000-0x3fff]: assigned
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.5: bridge window [io  0x4000-0x4fff]: assigned
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.4: bridge window [io  0x5000-0x5fff]: assigned
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.3: bridge window [io  0x6000-0x6fff]: assigned
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.2: bridge window [io  0x7000-0x7fff]: assigned
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.1: bridge window [io  0x8000-0x8fff]: assigned
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.0: bridge window [io  0x9000-0x9fff]: assigned
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.7: bridge window [io  0xa000-0xafff]: assigned
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.6: bridge window [io  0xb000-0xbfff]: assigned
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.5: bridge window [io  0xe000-0xefff]: assigned
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.4: bridge window [io  0xf000-0xffff]: assigned
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.3: bridge window [io  size 0x1000]: can't assign; no space
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.3: bridge window [io  size 0x1000]: failed to assign
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.2: bridge window [io  size 0x1000]: can't assign; no space
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.2: bridge window [io  size 0x1000]: failed to assign
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.1: bridge window [io  size 0x1000]: can't assign; no space
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.1: bridge window [io  size 0x1000]: failed to assign
Dec 05 05:22:02 localhost kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Dec 05 05:22:02 localhost kernel: pci 0000:01:00.0:   bridge window [io  0xc000-0xcfff]
Dec 05 05:22:02 localhost kernel: pci 0000:01:00.0:   bridge window [mem 0xfc600000-0xfc7fffff]
Dec 05 05:22:02 localhost kernel: pci 0000:01:00.0:   bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.0:   bridge window [io  0xc000-0xcfff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.0:   bridge window [mem 0xfc600000-0xfc9fffff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.0:   bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.1:   bridge window [mem 0xfe800000-0xfe9fffff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.1:   bridge window [mem 0xfbe00000-0xfbffffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.2:   bridge window [mem 0xfe600000-0xfe7fffff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.2:   bridge window [mem 0xfbc00000-0xfbdfffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.3:   bridge window [mem 0xfe400000-0xfe5fffff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.3:   bridge window [mem 0xfba00000-0xfbbfffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.4:   bridge window [io  0xf000-0xffff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.4:   bridge window [mem 0xfe200000-0xfe3fffff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.4:   bridge window [mem 0xfb800000-0xfb9fffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.5:   bridge window [io  0xe000-0xefff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.5:   bridge window [mem 0xfe000000-0xfe1fffff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.5:   bridge window [mem 0xfb600000-0xfb7fffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.6:   bridge window [io  0xb000-0xbfff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.6:   bridge window [mem 0xfde00000-0xfdffffff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.6:   bridge window [mem 0xfb400000-0xfb5fffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.7:   bridge window [io  0xa000-0xafff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.7:   bridge window [mem 0xfdc00000-0xfddfffff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:02.7:   bridge window [mem 0xfb200000-0xfb3fffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.0: PCI bridge to [bus 0a]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.0:   bridge window [io  0x9000-0x9fff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.0:   bridge window [mem 0xfda00000-0xfdbfffff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.0:   bridge window [mem 0xfb000000-0xfb1fffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.1: PCI bridge to [bus 0b]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.1:   bridge window [io  0x8000-0x8fff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.1:   bridge window [mem 0xfd800000-0xfd9fffff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.1:   bridge window [mem 0xfae00000-0xfaffffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.2: PCI bridge to [bus 0c]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.2:   bridge window [io  0x7000-0x7fff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.2:   bridge window [mem 0xfd600000-0xfd7fffff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.2:   bridge window [mem 0xfac00000-0xfadfffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.3: PCI bridge to [bus 0d]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.3:   bridge window [io  0x6000-0x6fff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.3:   bridge window [mem 0xfd400000-0xfd5fffff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.3:   bridge window [mem 0xfaa00000-0xfabfffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.4: PCI bridge to [bus 0e]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.4:   bridge window [io  0x5000-0x5fff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.4:   bridge window [mem 0xfd200000-0xfd3fffff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.4:   bridge window [mem 0xfa800000-0xfa9fffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.5: PCI bridge to [bus 0f]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.5:   bridge window [io  0x4000-0x4fff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.5:   bridge window [mem 0xfd000000-0xfd1fffff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.5:   bridge window [mem 0xfa600000-0xfa7fffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.6: PCI bridge to [bus 10]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.6:   bridge window [io  0x3000-0x3fff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.6:   bridge window [mem 0xfce00000-0xfcffffff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.6:   bridge window [mem 0xfa400000-0xfa5fffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.7: PCI bridge to [bus 11]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.7:   bridge window [io  0x2000-0x2fff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.7:   bridge window [mem 0xfcc00000-0xfcdfffff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:03.7:   bridge window [mem 0xfa200000-0xfa3fffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci 0000:00:04.0: PCI bridge to [bus 12]
Dec 05 05:22:02 localhost kernel: pci 0000:00:04.0:   bridge window [io  0x1000-0x1fff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:04.0:   bridge window [mem 0xfca00000-0xfcbfffff]
Dec 05 05:22:02 localhost kernel: pci 0000:00:04.0:   bridge window [mem 0xfa000000-0xfa1fffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xafffffff window]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:00: resource 9 [mem 0x280000000-0xa7fffffff window]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:01: resource 0 [io  0xc000-0xcfff]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:01: resource 1 [mem 0xfc600000-0xfc9fffff]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:01: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:02: resource 0 [io  0xc000-0xcfff]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:02: resource 1 [mem 0xfc600000-0xfc7fffff]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:02: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:03: resource 1 [mem 0xfe800000-0xfe9fffff]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:03: resource 2 [mem 0xfbe00000-0xfbffffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:04: resource 1 [mem 0xfe600000-0xfe7fffff]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:04: resource 2 [mem 0xfbc00000-0xfbdfffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:05: resource 1 [mem 0xfe400000-0xfe5fffff]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:05: resource 2 [mem 0xfba00000-0xfbbfffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:06: resource 0 [io  0xf000-0xffff]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:06: resource 1 [mem 0xfe200000-0xfe3fffff]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:06: resource 2 [mem 0xfb800000-0xfb9fffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:07: resource 0 [io  0xe000-0xefff]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:07: resource 1 [mem 0xfe000000-0xfe1fffff]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:07: resource 2 [mem 0xfb600000-0xfb7fffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:08: resource 0 [io  0xb000-0xbfff]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:08: resource 1 [mem 0xfde00000-0xfdffffff]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:08: resource 2 [mem 0xfb400000-0xfb5fffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:09: resource 0 [io  0xa000-0xafff]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:09: resource 1 [mem 0xfdc00000-0xfddfffff]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:09: resource 2 [mem 0xfb200000-0xfb3fffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:0a: resource 0 [io  0x9000-0x9fff]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:0a: resource 1 [mem 0xfda00000-0xfdbfffff]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:0a: resource 2 [mem 0xfb000000-0xfb1fffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:0b: resource 0 [io  0x8000-0x8fff]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:0b: resource 1 [mem 0xfd800000-0xfd9fffff]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:0b: resource 2 [mem 0xfae00000-0xfaffffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:0c: resource 0 [io  0x7000-0x7fff]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:0c: resource 1 [mem 0xfd600000-0xfd7fffff]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:0c: resource 2 [mem 0xfac00000-0xfadfffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:0d: resource 0 [io  0x6000-0x6fff]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:0d: resource 1 [mem 0xfd400000-0xfd5fffff]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:0d: resource 2 [mem 0xfaa00000-0xfabfffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:0e: resource 0 [io  0x5000-0x5fff]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:0e: resource 1 [mem 0xfd200000-0xfd3fffff]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:0e: resource 2 [mem 0xfa800000-0xfa9fffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:0f: resource 0 [io  0x4000-0x4fff]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:0f: resource 1 [mem 0xfd000000-0xfd1fffff]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:0f: resource 2 [mem 0xfa600000-0xfa7fffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:10: resource 0 [io  0x3000-0x3fff]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:10: resource 1 [mem 0xfce00000-0xfcffffff]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:10: resource 2 [mem 0xfa400000-0xfa5fffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:11: resource 0 [io  0x2000-0x2fff]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:11: resource 1 [mem 0xfcc00000-0xfcdfffff]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:11: resource 2 [mem 0xfa200000-0xfa3fffff 64bit pref]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:12: resource 0 [io  0x1000-0x1fff]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:12: resource 1 [mem 0xfca00000-0xfcbfffff]
Dec 05 05:22:02 localhost kernel: pci_bus 0000:12: resource 2 [mem 0xfa000000-0xfa1fffff 64bit pref]
Dec 05 05:22:02 localhost kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Dec 05 05:22:02 localhost kernel: PCI: CLS 0 bytes, default 64
Dec 05 05:22:02 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Dec 05 05:22:02 localhost kernel: software IO TLB: mapped [mem 0x000000006b000000-0x000000006f000000] (64MB)
Dec 05 05:22:02 localhost kernel: Trying to unpack rootfs image as initramfs...
Dec 05 05:22:02 localhost kernel: ACPI: bus type thunderbolt registered
Dec 05 05:22:02 localhost kernel: Initialise system trusted keyrings
Dec 05 05:22:02 localhost kernel: Key type blacklist registered
Dec 05 05:22:02 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Dec 05 05:22:02 localhost kernel: zbud: loaded
Dec 05 05:22:02 localhost kernel: integrity: Platform Keyring initialized
Dec 05 05:22:02 localhost kernel: integrity: Machine keyring initialized
Dec 05 05:22:02 localhost kernel: Freeing initrd memory: 87804K
Dec 05 05:22:02 localhost kernel: NET: Registered PF_ALG protocol family
Dec 05 05:22:02 localhost kernel: xor: automatically using best checksumming function   avx
Dec 05 05:22:02 localhost kernel: Key type asymmetric registered
Dec 05 05:22:02 localhost kernel: Asymmetric key parser 'x509' registered
Dec 05 05:22:02 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Dec 05 05:22:02 localhost kernel: io scheduler mq-deadline registered
Dec 05 05:22:02 localhost kernel: io scheduler kyber registered
Dec 05 05:22:02 localhost kernel: io scheduler bfq registered
Dec 05 05:22:02 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Dec 05 05:22:02 localhost kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24
Dec 05 05:22:02 localhost kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24
Dec 05 05:22:02 localhost kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25
Dec 05 05:22:02 localhost kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25
Dec 05 05:22:02 localhost kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26
Dec 05 05:22:02 localhost kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26
Dec 05 05:22:02 localhost kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27
Dec 05 05:22:02 localhost kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27
Dec 05 05:22:02 localhost kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28
Dec 05 05:22:02 localhost kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28
Dec 05 05:22:02 localhost kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29
Dec 05 05:22:02 localhost kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29
Dec 05 05:22:02 localhost kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30
Dec 05 05:22:02 localhost kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30
Dec 05 05:22:02 localhost kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31
Dec 05 05:22:02 localhost kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31
Dec 05 05:22:02 localhost kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23
Dec 05 05:22:02 localhost kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32
Dec 05 05:22:02 localhost kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32
Dec 05 05:22:02 localhost kernel: pcieport 0000:00:03.1: PME: Signaling with IRQ 33
Dec 05 05:22:02 localhost kernel: pcieport 0000:00:03.1: AER: enabled with IRQ 33
Dec 05 05:22:02 localhost kernel: pcieport 0000:00:03.2: PME: Signaling with IRQ 34
Dec 05 05:22:02 localhost kernel: pcieport 0000:00:03.2: AER: enabled with IRQ 34
Dec 05 05:22:02 localhost kernel: pcieport 0000:00:03.3: PME: Signaling with IRQ 35
Dec 05 05:22:02 localhost kernel: pcieport 0000:00:03.3: AER: enabled with IRQ 35
Dec 05 05:22:02 localhost kernel: pcieport 0000:00:03.4: PME: Signaling with IRQ 36
Dec 05 05:22:02 localhost kernel: pcieport 0000:00:03.4: AER: enabled with IRQ 36
Dec 05 05:22:02 localhost kernel: pcieport 0000:00:03.5: PME: Signaling with IRQ 37
Dec 05 05:22:02 localhost kernel: pcieport 0000:00:03.5: AER: enabled with IRQ 37
Dec 05 05:22:02 localhost kernel: pcieport 0000:00:03.6: PME: Signaling with IRQ 38
Dec 05 05:22:02 localhost kernel: pcieport 0000:00:03.6: AER: enabled with IRQ 38
Dec 05 05:22:02 localhost kernel: pcieport 0000:00:03.7: PME: Signaling with IRQ 39
Dec 05 05:22:02 localhost kernel: pcieport 0000:00:03.7: AER: enabled with IRQ 39
Dec 05 05:22:02 localhost kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20
Dec 05 05:22:02 localhost kernel: pcieport 0000:00:04.0: PME: Signaling with IRQ 40
Dec 05 05:22:02 localhost kernel: pcieport 0000:00:04.0: AER: enabled with IRQ 40
Dec 05 05:22:02 localhost kernel: shpchp 0000:01:00.0: HPC vendor_id 1b36 device_id e ss_vid 0 ss_did 0
Dec 05 05:22:02 localhost kernel: shpchp 0000:01:00.0: pci_hp_register failed with error -16
Dec 05 05:22:02 localhost kernel: shpchp 0000:01:00.0: Slot initialization failed
Dec 05 05:22:02 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Dec 05 05:22:02 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Dec 05 05:22:02 localhost kernel: ACPI: button: Power Button [PWRF]
Dec 05 05:22:02 localhost kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21
Dec 05 05:22:02 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Dec 05 05:22:02 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Dec 05 05:22:02 localhost kernel: Non-volatile memory driver v1.3
Dec 05 05:22:02 localhost kernel: rdac: device handler registered
Dec 05 05:22:02 localhost kernel: hp_sw: device handler registered
Dec 05 05:22:02 localhost kernel: emc: device handler registered
Dec 05 05:22:02 localhost kernel: alua: device handler registered
Dec 05 05:22:02 localhost kernel: uhci_hcd 0000:02:01.0: UHCI Host Controller
Dec 05 05:22:02 localhost kernel: uhci_hcd 0000:02:01.0: new USB bus registered, assigned bus number 1
Dec 05 05:22:02 localhost kernel: uhci_hcd 0000:02:01.0: detected 2 ports
Dec 05 05:22:02 localhost kernel: uhci_hcd 0000:02:01.0: irq 22, io port 0x0000c000
Dec 05 05:22:02 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Dec 05 05:22:02 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Dec 05 05:22:02 localhost kernel: usb usb1: Product: UHCI Host Controller
Dec 05 05:22:02 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-645.el9.x86_64 uhci_hcd
Dec 05 05:22:02 localhost kernel: usb usb1: SerialNumber: 0000:02:01.0
Dec 05 05:22:02 localhost kernel: hub 1-0:1.0: USB hub found
Dec 05 05:22:02 localhost kernel: hub 1-0:1.0: 2 ports detected
Dec 05 05:22:02 localhost kernel: usbcore: registered new interface driver usbserial_generic
Dec 05 05:22:02 localhost kernel: usbserial: USB Serial support registered for generic
Dec 05 05:22:02 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Dec 05 05:22:02 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Dec 05 05:22:02 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Dec 05 05:22:02 localhost kernel: mousedev: PS/2 mouse device common for all mice
Dec 05 05:22:02 localhost kernel: rtc_cmos 00:03: RTC can wake from S4
Dec 05 05:22:02 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Dec 05 05:22:02 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Dec 05 05:22:02 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Dec 05 05:22:02 localhost kernel: rtc_cmos 00:03: registered as rtc0
Dec 05 05:22:02 localhost kernel: rtc_cmos 00:03: setting system clock to 2025-12-05T05:22:02 UTC (1764912122)
Dec 05 05:22:02 localhost kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram
Dec 05 05:22:02 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Dec 05 05:22:02 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Dec 05 05:22:02 localhost kernel: usbcore: registered new interface driver usbhid
Dec 05 05:22:02 localhost kernel: usbhid: USB HID core driver
Dec 05 05:22:02 localhost kernel: drop_monitor: Initializing network drop monitor service
Dec 05 05:22:02 localhost kernel: Initializing XFRM netlink socket
Dec 05 05:22:02 localhost kernel: NET: Registered PF_INET6 protocol family
Dec 05 05:22:02 localhost kernel: Segment Routing with IPv6
Dec 05 05:22:02 localhost kernel: NET: Registered PF_PACKET protocol family
Dec 05 05:22:02 localhost kernel: mpls_gso: MPLS GSO support
Dec 05 05:22:02 localhost kernel: IPI shorthand broadcast: enabled
Dec 05 05:22:02 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Dec 05 05:22:02 localhost kernel: AES CTR mode by8 optimization enabled
Dec 05 05:22:02 localhost kernel: sched_clock: Marking stable (1101002406, 142290076)->(1310238235, -66945753)
Dec 05 05:22:02 localhost kernel: registered taskstats version 1
Dec 05 05:22:02 localhost kernel: Loading compiled-in X.509 certificates
Dec 05 05:22:02 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4c28336b4850d771d036b52fb2778fdb4f02f708'
Dec 05 05:22:02 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Dec 05 05:22:02 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Dec 05 05:22:02 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Dec 05 05:22:02 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Dec 05 05:22:02 localhost kernel: Demotion targets for Node 0: null
Dec 05 05:22:02 localhost kernel: page_owner is disabled
Dec 05 05:22:02 localhost kernel: Key type .fscrypt registered
Dec 05 05:22:02 localhost kernel: Key type fscrypt-provisioning registered
Dec 05 05:22:02 localhost kernel: Key type big_key registered
Dec 05 05:22:02 localhost kernel: Key type encrypted registered
Dec 05 05:22:02 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Dec 05 05:22:02 localhost kernel: Loading compiled-in module X.509 certificates
Dec 05 05:22:02 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4c28336b4850d771d036b52fb2778fdb4f02f708'
Dec 05 05:22:02 localhost kernel: ima: Allocated hash algorithm: sha256
Dec 05 05:22:02 localhost kernel: ima: No architecture policies found
Dec 05 05:22:02 localhost kernel: evm: Initialising EVM extended attributes:
Dec 05 05:22:02 localhost kernel: evm: security.selinux
Dec 05 05:22:02 localhost kernel: evm: security.SMACK64 (disabled)
Dec 05 05:22:02 localhost kernel: evm: security.SMACK64EXEC (disabled)
Dec 05 05:22:02 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Dec 05 05:22:02 localhost kernel: evm: security.SMACK64MMAP (disabled)
Dec 05 05:22:02 localhost kernel: evm: security.apparmor (disabled)
Dec 05 05:22:02 localhost kernel: evm: security.ima
Dec 05 05:22:02 localhost kernel: evm: security.capability
Dec 05 05:22:02 localhost kernel: evm: HMAC attrs: 0x1
Dec 05 05:22:02 localhost kernel: Running certificate verification RSA selftest
Dec 05 05:22:02 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Dec 05 05:22:02 localhost kernel: Running certificate verification ECDSA selftest
Dec 05 05:22:02 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Dec 05 05:22:02 localhost kernel: clk: Disabling unused clocks
Dec 05 05:22:02 localhost kernel: Freeing unused decrypted memory: 2028K
Dec 05 05:22:02 localhost kernel: Freeing unused kernel image (initmem) memory: 4196K
Dec 05 05:22:02 localhost kernel: Write protecting the kernel read-only data: 30720k
Dec 05 05:22:02 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 428K
Dec 05 05:22:02 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Dec 05 05:22:02 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Dec 05 05:22:02 localhost kernel: Run /init as init process
Dec 05 05:22:02 localhost kernel:   with arguments:
Dec 05 05:22:02 localhost kernel:     /init
Dec 05 05:22:02 localhost kernel:   with environment:
Dec 05 05:22:02 localhost kernel:     HOME=/
Dec 05 05:22:02 localhost kernel:     TERM=linux
Dec 05 05:22:02 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64
Dec 05 05:22:02 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec 05 05:22:02 localhost systemd[1]: Detected virtualization kvm.
Dec 05 05:22:02 localhost systemd[1]: Detected architecture x86-64.
Dec 05 05:22:02 localhost systemd[1]: Running in initrd.
Dec 05 05:22:02 localhost systemd[1]: No hostname configured, using default hostname.
Dec 05 05:22:02 localhost systemd[1]: Hostname set to <localhost>.
Dec 05 05:22:02 localhost systemd[1]: Initializing machine ID from VM UUID.
Dec 05 05:22:02 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Dec 05 05:22:02 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Dec 05 05:22:02 localhost systemd[1]: Reached target Local Encrypted Volumes.
Dec 05 05:22:02 localhost systemd[1]: Reached target Initrd /usr File System.
Dec 05 05:22:02 localhost systemd[1]: Reached target Local File Systems.
Dec 05 05:22:02 localhost systemd[1]: Reached target Path Units.
Dec 05 05:22:02 localhost systemd[1]: Reached target Slice Units.
Dec 05 05:22:02 localhost systemd[1]: Reached target Swaps.
Dec 05 05:22:02 localhost systemd[1]: Reached target Timer Units.
Dec 05 05:22:02 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Dec 05 05:22:02 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Dec 05 05:22:02 localhost systemd[1]: Listening on Journal Socket.
Dec 05 05:22:02 localhost systemd[1]: Listening on udev Control Socket.
Dec 05 05:22:02 localhost systemd[1]: Listening on udev Kernel Socket.
Dec 05 05:22:02 localhost systemd[1]: Reached target Socket Units.
Dec 05 05:22:02 localhost systemd[1]: Starting Create List of Static Device Nodes...
Dec 05 05:22:02 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Dec 05 05:22:02 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Dec 05 05:22:02 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Dec 05 05:22:02 localhost kernel: usb 1-1: Manufacturer: QEMU
Dec 05 05:22:02 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:02.0:00.0:01.0-1
Dec 05 05:22:02 localhost systemd[1]: Starting Journal Service...
Dec 05 05:22:02 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Dec 05 05:22:02 localhost systemd[1]: Starting Apply Kernel Variables...
Dec 05 05:22:02 localhost systemd[1]: Starting Create System Users...
Dec 05 05:22:02 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.0/0000:01:00.0/0000:02:01.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Dec 05 05:22:02 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:01.0-1/input0
Dec 05 05:22:02 localhost systemd[1]: Starting Setup Virtual Console...
Dec 05 05:22:02 localhost systemd[1]: Finished Create List of Static Device Nodes.
Dec 05 05:22:02 localhost systemd[1]: Finished Apply Kernel Variables.
Dec 05 05:22:02 localhost systemd[1]: Finished Create System Users.
Dec 05 05:22:02 localhost systemd-journald[281]: Journal started
Dec 05 05:22:02 localhost systemd-journald[281]: Runtime Journal (/run/log/journal/5b9d87813e9f40359ba69ba6577c62f7) is 8.0M, max 153.6M, 145.6M free.
Dec 05 05:22:02 localhost systemd-sysusers[284]: Creating group 'users' with GID 100.
Dec 05 05:22:02 localhost systemd-sysusers[284]: Creating group 'dbus' with GID 81.
Dec 05 05:22:02 localhost systemd-sysusers[284]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Dec 05 05:22:02 localhost systemd[1]: Started Journal Service.
Dec 05 05:22:02 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Dec 05 05:22:02 localhost systemd[1]: Starting Create Volatile Files and Directories...
Dec 05 05:22:02 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Dec 05 05:22:03 localhost systemd[1]: Finished Create Volatile Files and Directories.
Dec 05 05:22:03 localhost systemd[1]: Finished Setup Virtual Console.
Dec 05 05:22:03 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Dec 05 05:22:03 localhost systemd[1]: Starting dracut cmdline hook...
Dec 05 05:22:03 localhost dracut-cmdline[296]: dracut-9 dracut-057-102.git20250818.el9
Dec 05 05:22:03 localhost dracut-cmdline[296]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64 root=UUID=fcf6b761-831a-48a7-9f5f-068b5063763f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec 05 05:22:03 localhost systemd[1]: Finished dracut cmdline hook.
Dec 05 05:22:03 localhost systemd[1]: Starting dracut pre-udev hook...
Dec 05 05:22:03 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Dec 05 05:22:03 localhost kernel: device-mapper: uevent: version 1.0.3
Dec 05 05:22:03 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Dec 05 05:22:03 localhost kernel: RPC: Registered named UNIX socket transport module.
Dec 05 05:22:03 localhost kernel: RPC: Registered udp transport module.
Dec 05 05:22:03 localhost kernel: RPC: Registered tcp transport module.
Dec 05 05:22:03 localhost kernel: RPC: Registered tcp-with-tls transport module.
Dec 05 05:22:03 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Dec 05 05:22:03 localhost rpc.statd[411]: Version 2.5.4 starting
Dec 05 05:22:03 localhost rpc.statd[411]: Initializing NSM state
Dec 05 05:22:03 localhost rpc.idmapd[416]: Setting log level to 0
Dec 05 05:22:03 localhost systemd[1]: Finished dracut pre-udev hook.
Dec 05 05:22:03 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec 05 05:22:03 localhost systemd-udevd[429]: Using default interface naming scheme 'rhel-9.0'.
Dec 05 05:22:03 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec 05 05:22:03 localhost systemd[1]: Starting dracut pre-trigger hook...
Dec 05 05:22:03 localhost systemd[1]: Finished dracut pre-trigger hook.
Dec 05 05:22:03 localhost systemd[1]: Starting Coldplug All udev Devices...
Dec 05 05:22:03 localhost systemd[1]: Created slice Slice /system/modprobe.
Dec 05 05:22:03 localhost systemd[1]: Starting Load Kernel Module configfs...
Dec 05 05:22:03 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 05 05:22:03 localhost systemd[1]: Finished Load Kernel Module configfs.
Dec 05 05:22:03 localhost systemd[1]: Finished Coldplug All udev Devices.
Dec 05 05:22:03 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec 05 05:22:03 localhost systemd[1]: Reached target Network.
Dec 05 05:22:03 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec 05 05:22:03 localhost systemd[1]: Starting dracut initqueue hook...
Dec 05 05:22:03 localhost kernel: virtio_blk virtio2: 4/0/0 default/read/poll queues
Dec 05 05:22:03 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Dec 05 05:22:03 localhost kernel:  vda: vda1
Dec 05 05:22:03 localhost systemd-udevd[432]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 05:22:03 localhost kernel: libata version 3.00 loaded.
Dec 05 05:22:03 localhost kernel: ahci 0000:00:1f.2: version 3.0
Dec 05 05:22:03 localhost kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16
Dec 05 05:22:03 localhost kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode
Dec 05 05:22:03 localhost kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f)
Dec 05 05:22:03 localhost kernel: ahci 0000:00:1f.2: flags: 64bit ncq only 
Dec 05 05:22:03 localhost kernel: scsi host0: ahci
Dec 05 05:22:03 localhost kernel: scsi host1: ahci
Dec 05 05:22:03 localhost kernel: scsi host2: ahci
Dec 05 05:22:03 localhost kernel: scsi host3: ahci
Dec 05 05:22:03 localhost kernel: scsi host4: ahci
Dec 05 05:22:03 localhost kernel: scsi host5: ahci
Dec 05 05:22:03 localhost kernel: ata1: SATA max UDMA/133 abar m4096@0xfea22000 port 0xfea22100 irq 49 lpm-pol 0
Dec 05 05:22:03 localhost kernel: ata2: SATA max UDMA/133 abar m4096@0xfea22000 port 0xfea22180 irq 49 lpm-pol 0
Dec 05 05:22:03 localhost kernel: ata3: SATA max UDMA/133 abar m4096@0xfea22000 port 0xfea22200 irq 49 lpm-pol 0
Dec 05 05:22:03 localhost kernel: ata4: SATA max UDMA/133 abar m4096@0xfea22000 port 0xfea22280 irq 49 lpm-pol 0
Dec 05 05:22:03 localhost kernel: ata5: SATA max UDMA/133 abar m4096@0xfea22000 port 0xfea22300 irq 49 lpm-pol 0
Dec 05 05:22:03 localhost kernel: ata6: SATA max UDMA/133 abar m4096@0xfea22000 port 0xfea22380 irq 49 lpm-pol 0
Dec 05 05:22:03 localhost systemd[1]: Found device /dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f.
Dec 05 05:22:03 localhost systemd[1]: Reached target Initrd Root Device.
Dec 05 05:22:03 localhost systemd[1]: Mounting Kernel Configuration File System...
Dec 05 05:22:03 localhost systemd[1]: Mounted Kernel Configuration File System.
Dec 05 05:22:03 localhost systemd[1]: Reached target System Initialization.
Dec 05 05:22:03 localhost systemd[1]: Reached target Basic System.
Dec 05 05:22:03 localhost kernel: ata1: SATA link up 1.5 Gbps (SStatus 113 SControl 300)
Dec 05 05:22:03 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Dec 05 05:22:03 localhost kernel: ata1.00: applying bridge limits
Dec 05 05:22:03 localhost kernel: ata1.00: configured for UDMA/100
Dec 05 05:22:03 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Dec 05 05:22:03 localhost kernel: ata6: SATA link down (SStatus 0 SControl 300)
Dec 05 05:22:03 localhost kernel: ata4: SATA link down (SStatus 0 SControl 300)
Dec 05 05:22:03 localhost kernel: ata2: SATA link down (SStatus 0 SControl 300)
Dec 05 05:22:03 localhost kernel: ata3: SATA link down (SStatus 0 SControl 300)
Dec 05 05:22:03 localhost kernel: ata5: SATA link down (SStatus 0 SControl 300)
Dec 05 05:22:03 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Dec 05 05:22:03 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Dec 05 05:22:03 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Dec 05 05:22:03 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Dec 05 05:22:04 localhost systemd[1]: Finished dracut initqueue hook.
Dec 05 05:22:04 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Dec 05 05:22:04 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Dec 05 05:22:04 localhost systemd[1]: Reached target Remote File Systems.
Dec 05 05:22:04 localhost systemd[1]: Starting dracut pre-mount hook...
Dec 05 05:22:04 localhost systemd[1]: Finished dracut pre-mount hook.
Dec 05 05:22:04 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f...
Dec 05 05:22:04 localhost systemd-fsck[524]: /usr/sbin/fsck.xfs: XFS file system.
Dec 05 05:22:04 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f.
Dec 05 05:22:04 localhost systemd[1]: Mounting /sysroot...
Dec 05 05:22:04 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Dec 05 05:22:04 localhost kernel: XFS (vda1): Mounting V5 Filesystem fcf6b761-831a-48a7-9f5f-068b5063763f
Dec 05 05:22:04 localhost kernel: XFS (vda1): Ending clean mount
Dec 05 05:22:04 localhost systemd[1]: Mounted /sysroot.
Dec 05 05:22:04 localhost systemd[1]: Reached target Initrd Root File System.
Dec 05 05:22:04 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Dec 05 05:22:04 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Dec 05 05:22:04 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Dec 05 05:22:04 localhost systemd[1]: Reached target Initrd File Systems.
Dec 05 05:22:04 localhost systemd[1]: Reached target Initrd Default Target.
Dec 05 05:22:04 localhost systemd[1]: Starting dracut mount hook...
Dec 05 05:22:04 localhost systemd[1]: Finished dracut mount hook.
Dec 05 05:22:04 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Dec 05 05:22:04 localhost rpc.idmapd[416]: exiting on signal 15
Dec 05 05:22:04 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Dec 05 05:22:04 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Dec 05 05:22:04 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Dec 05 05:22:04 localhost systemd[1]: Stopped target Network.
Dec 05 05:22:04 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Dec 05 05:22:04 localhost systemd[1]: Stopped target Timer Units.
Dec 05 05:22:04 localhost systemd[1]: dbus.socket: Deactivated successfully.
Dec 05 05:22:04 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Dec 05 05:22:04 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Dec 05 05:22:04 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Dec 05 05:22:04 localhost systemd[1]: Stopped target Initrd Default Target.
Dec 05 05:22:04 localhost systemd[1]: Stopped target Basic System.
Dec 05 05:22:04 localhost systemd[1]: Stopped target Initrd Root Device.
Dec 05 05:22:04 localhost systemd[1]: Stopped target Initrd /usr File System.
Dec 05 05:22:04 localhost systemd[1]: Stopped target Path Units.
Dec 05 05:22:04 localhost systemd[1]: Stopped target Remote File Systems.
Dec 05 05:22:04 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Dec 05 05:22:04 localhost systemd[1]: Stopped target Slice Units.
Dec 05 05:22:04 localhost systemd[1]: Stopped target Socket Units.
Dec 05 05:22:04 localhost systemd[1]: Stopped target System Initialization.
Dec 05 05:22:04 localhost systemd[1]: Stopped target Local File Systems.
Dec 05 05:22:04 localhost systemd[1]: Stopped target Swaps.
Dec 05 05:22:04 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Dec 05 05:22:04 localhost systemd[1]: Stopped dracut mount hook.
Dec 05 05:22:04 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Dec 05 05:22:04 localhost systemd[1]: Stopped dracut pre-mount hook.
Dec 05 05:22:04 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Dec 05 05:22:04 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Dec 05 05:22:04 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Dec 05 05:22:04 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Dec 05 05:22:04 localhost systemd[1]: Stopped dracut initqueue hook.
Dec 05 05:22:04 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 05 05:22:04 localhost systemd[1]: Stopped Apply Kernel Variables.
Dec 05 05:22:04 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Dec 05 05:22:04 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Dec 05 05:22:04 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Dec 05 05:22:04 localhost systemd[1]: Stopped Coldplug All udev Devices.
Dec 05 05:22:04 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Dec 05 05:22:04 localhost systemd[1]: Stopped dracut pre-trigger hook.
Dec 05 05:22:04 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Dec 05 05:22:04 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec 05 05:22:04 localhost systemd[1]: Stopped Setup Virtual Console.
Dec 05 05:22:04 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Dec 05 05:22:04 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Dec 05 05:22:04 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Dec 05 05:22:04 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Dec 05 05:22:04 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Dec 05 05:22:04 localhost systemd[1]: Closed udev Control Socket.
Dec 05 05:22:04 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Dec 05 05:22:04 localhost systemd[1]: Closed udev Kernel Socket.
Dec 05 05:22:04 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Dec 05 05:22:04 localhost systemd[1]: Stopped dracut pre-udev hook.
Dec 05 05:22:04 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Dec 05 05:22:04 localhost systemd[1]: Stopped dracut cmdline hook.
Dec 05 05:22:04 localhost systemd[1]: Starting Cleanup udev Database...
Dec 05 05:22:04 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Dec 05 05:22:04 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Dec 05 05:22:04 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Dec 05 05:22:04 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Dec 05 05:22:04 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Dec 05 05:22:04 localhost systemd[1]: Stopped Create System Users.
Dec 05 05:22:04 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Dec 05 05:22:04 localhost systemd[1]: Finished Cleanup udev Database.
Dec 05 05:22:04 localhost systemd[1]: Reached target Switch Root.
Dec 05 05:22:04 localhost systemd[1]: Starting Switch Root...
Dec 05 05:22:04 localhost systemd[1]: Switching root.
Dec 05 05:22:04 localhost systemd-journald[281]: Journal stopped
Dec 05 05:22:05 localhost systemd-journald[281]: Received SIGTERM from PID 1 (systemd).
Dec 05 05:22:05 localhost kernel: audit: type=1404 audit(1764912124.956:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Dec 05 05:22:05 localhost kernel: SELinux:  policy capability network_peer_controls=1
Dec 05 05:22:05 localhost kernel: SELinux:  policy capability open_perms=1
Dec 05 05:22:05 localhost kernel: SELinux:  policy capability extended_socket_class=1
Dec 05 05:22:05 localhost kernel: SELinux:  policy capability always_check_network=0
Dec 05 05:22:05 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 05 05:22:05 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 05 05:22:05 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 05 05:22:05 localhost kernel: audit: type=1403 audit(1764912125.070:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Dec 05 05:22:05 localhost systemd[1]: Successfully loaded SELinux policy in 117.047ms.
Dec 05 05:22:05 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 22.578ms.
Dec 05 05:22:05 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec 05 05:22:05 localhost systemd[1]: Detected virtualization kvm.
Dec 05 05:22:05 localhost systemd[1]: Detected architecture x86-64.
Dec 05 05:22:05 localhost systemd-rc-local-generator[607]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 05:22:05 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Dec 05 05:22:05 localhost systemd[1]: Stopped Switch Root.
Dec 05 05:22:05 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Dec 05 05:22:05 localhost systemd[1]: Created slice Slice /system/getty.
Dec 05 05:22:05 localhost systemd[1]: Created slice Slice /system/serial-getty.
Dec 05 05:22:05 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Dec 05 05:22:05 localhost systemd[1]: Created slice User and Session Slice.
Dec 05 05:22:05 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Dec 05 05:22:05 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Dec 05 05:22:05 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Dec 05 05:22:05 localhost systemd[1]: Reached target Local Encrypted Volumes.
Dec 05 05:22:05 localhost systemd[1]: Stopped target Switch Root.
Dec 05 05:22:05 localhost systemd[1]: Stopped target Initrd File Systems.
Dec 05 05:22:05 localhost systemd[1]: Stopped target Initrd Root File System.
Dec 05 05:22:05 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Dec 05 05:22:05 localhost systemd[1]: Reached target Path Units.
Dec 05 05:22:05 localhost systemd[1]: Reached target rpc_pipefs.target.
Dec 05 05:22:05 localhost systemd[1]: Reached target Slice Units.
Dec 05 05:22:05 localhost systemd[1]: Reached target Swaps.
Dec 05 05:22:05 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Dec 05 05:22:05 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Dec 05 05:22:05 localhost systemd[1]: Reached target RPC Port Mapper.
Dec 05 05:22:05 localhost systemd[1]: Listening on Process Core Dump Socket.
Dec 05 05:22:05 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Dec 05 05:22:05 localhost systemd[1]: Listening on udev Control Socket.
Dec 05 05:22:05 localhost systemd[1]: Listening on udev Kernel Socket.
Dec 05 05:22:05 localhost systemd[1]: Mounting Huge Pages File System...
Dec 05 05:22:05 localhost systemd[1]: Mounting POSIX Message Queue File System...
Dec 05 05:22:05 localhost systemd[1]: Mounting Kernel Debug File System...
Dec 05 05:22:05 localhost systemd[1]: Mounting Kernel Trace File System...
Dec 05 05:22:05 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec 05 05:22:05 localhost systemd[1]: Starting Create List of Static Device Nodes...
Dec 05 05:22:05 localhost systemd[1]: Starting Load Kernel Module configfs...
Dec 05 05:22:05 localhost systemd[1]: Starting Load Kernel Module drm...
Dec 05 05:22:05 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Dec 05 05:22:05 localhost systemd[1]: Starting Load Kernel Module fuse...
Dec 05 05:22:05 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Dec 05 05:22:05 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Dec 05 05:22:05 localhost systemd[1]: Stopped File System Check on Root Device.
Dec 05 05:22:05 localhost systemd[1]: Stopped Journal Service.
Dec 05 05:22:05 localhost systemd[1]: Starting Journal Service...
Dec 05 05:22:05 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Dec 05 05:22:05 localhost systemd[1]: Starting Generate network units from Kernel command line...
Dec 05 05:22:05 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 05 05:22:05 localhost kernel: fuse: init (API version 7.37)
Dec 05 05:22:05 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Dec 05 05:22:05 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Dec 05 05:22:05 localhost systemd[1]: Starting Apply Kernel Variables...
Dec 05 05:22:05 localhost systemd[1]: Starting Coldplug All udev Devices...
Dec 05 05:22:05 localhost systemd[1]: Mounted Huge Pages File System.
Dec 05 05:22:05 localhost systemd[1]: Mounted POSIX Message Queue File System.
Dec 05 05:22:05 localhost systemd[1]: Mounted Kernel Debug File System.
Dec 05 05:22:05 localhost systemd[1]: Mounted Kernel Trace File System.
Dec 05 05:22:05 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Dec 05 05:22:05 localhost systemd-journald[648]: Journal started
Dec 05 05:22:05 localhost systemd-journald[648]: Runtime Journal (/run/log/journal/4d4ef2323cc3337bbfd9081b2a323b4e) is 8.0M, max 153.6M, 145.6M free.
Dec 05 05:22:05 localhost systemd[1]: Queued start job for default target Multi-User System.
Dec 05 05:22:05 localhost systemd[1]: Started Journal Service.
Dec 05 05:22:05 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Dec 05 05:22:05 localhost systemd[1]: Finished Create List of Static Device Nodes.
Dec 05 05:22:05 localhost kernel: ACPI: bus type drm_connector registered
Dec 05 05:22:05 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 05 05:22:05 localhost systemd[1]: Finished Load Kernel Module configfs.
Dec 05 05:22:05 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec 05 05:22:05 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Dec 05 05:22:05 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec 05 05:22:05 localhost systemd[1]: Finished Load Kernel Module drm.
Dec 05 05:22:05 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Dec 05 05:22:05 localhost systemd[1]: Finished Load Kernel Module fuse.
Dec 05 05:22:05 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Dec 05 05:22:05 localhost systemd[1]: Finished Generate network units from Kernel command line.
Dec 05 05:22:05 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Dec 05 05:22:05 localhost systemd[1]: Finished Apply Kernel Variables.
Dec 05 05:22:05 localhost systemd[1]: Mounting FUSE Control File System...
Dec 05 05:22:05 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec 05 05:22:05 localhost systemd[1]: Starting Rebuild Hardware Database...
Dec 05 05:22:05 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Dec 05 05:22:05 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Dec 05 05:22:05 localhost systemd-journald[648]: Runtime Journal (/run/log/journal/4d4ef2323cc3337bbfd9081b2a323b4e) is 8.0M, max 153.6M, 145.6M free.
Dec 05 05:22:05 localhost systemd-journald[648]: Received client request to flush runtime journal.
Dec 05 05:22:05 localhost systemd[1]: Starting Load/Save OS Random Seed...
Dec 05 05:22:05 localhost systemd[1]: Starting Create System Users...
Dec 05 05:22:05 localhost systemd[1]: Mounted FUSE Control File System.
Dec 05 05:22:05 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Dec 05 05:22:05 localhost systemd[1]: Finished Load/Save OS Random Seed.
Dec 05 05:22:05 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec 05 05:22:05 localhost systemd[1]: Finished Create System Users.
Dec 05 05:22:05 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Dec 05 05:22:05 localhost systemd[1]: Finished Coldplug All udev Devices.
Dec 05 05:22:05 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Dec 05 05:22:05 localhost systemd[1]: Reached target Preparation for Local File Systems.
Dec 05 05:22:05 localhost systemd[1]: Reached target Local File Systems.
Dec 05 05:22:05 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Dec 05 05:22:05 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Dec 05 05:22:05 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 05 05:22:05 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Dec 05 05:22:05 localhost systemd[1]: Starting Automatic Boot Loader Update...
Dec 05 05:22:05 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Dec 05 05:22:05 localhost systemd[1]: Starting Create Volatile Files and Directories...
Dec 05 05:22:05 localhost bootctl[666]: Couldn't find EFI system partition, skipping.
Dec 05 05:22:05 localhost systemd[1]: Finished Automatic Boot Loader Update.
Dec 05 05:22:05 localhost systemd[1]: Finished Create Volatile Files and Directories.
Dec 05 05:22:05 localhost systemd[1]: Starting Security Auditing Service...
Dec 05 05:22:05 localhost systemd[1]: Starting RPC Bind...
Dec 05 05:22:05 localhost systemd[1]: Starting Rebuild Journal Catalog...
Dec 05 05:22:05 localhost auditd[672]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Dec 05 05:22:05 localhost auditd[672]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Dec 05 05:22:05 localhost systemd[1]: Started RPC Bind.
Dec 05 05:22:05 localhost systemd[1]: Finished Rebuild Journal Catalog.
Dec 05 05:22:05 localhost augenrules[677]: /sbin/augenrules: No change
Dec 05 05:22:05 localhost augenrules[692]: No rules
Dec 05 05:22:05 localhost augenrules[692]: enabled 1
Dec 05 05:22:05 localhost augenrules[692]: failure 1
Dec 05 05:22:05 localhost augenrules[692]: pid 672
Dec 05 05:22:05 localhost augenrules[692]: rate_limit 0
Dec 05 05:22:05 localhost augenrules[692]: backlog_limit 8192
Dec 05 05:22:05 localhost augenrules[692]: lost 0
Dec 05 05:22:05 localhost augenrules[692]: backlog 2
Dec 05 05:22:05 localhost augenrules[692]: backlog_wait_time 60000
Dec 05 05:22:05 localhost augenrules[692]: backlog_wait_time_actual 0
Dec 05 05:22:05 localhost augenrules[692]: enabled 1
Dec 05 05:22:05 localhost augenrules[692]: failure 1
Dec 05 05:22:05 localhost augenrules[692]: pid 672
Dec 05 05:22:05 localhost augenrules[692]: rate_limit 0
Dec 05 05:22:05 localhost augenrules[692]: backlog_limit 8192
Dec 05 05:22:05 localhost augenrules[692]: lost 0
Dec 05 05:22:05 localhost augenrules[692]: backlog 4
Dec 05 05:22:05 localhost augenrules[692]: backlog_wait_time 60000
Dec 05 05:22:05 localhost augenrules[692]: backlog_wait_time_actual 0
Dec 05 05:22:05 localhost augenrules[692]: enabled 1
Dec 05 05:22:05 localhost augenrules[692]: failure 1
Dec 05 05:22:05 localhost augenrules[692]: pid 672
Dec 05 05:22:05 localhost augenrules[692]: rate_limit 0
Dec 05 05:22:05 localhost augenrules[692]: backlog_limit 8192
Dec 05 05:22:05 localhost augenrules[692]: lost 0
Dec 05 05:22:05 localhost augenrules[692]: backlog 0
Dec 05 05:22:05 localhost augenrules[692]: backlog_wait_time 60000
Dec 05 05:22:05 localhost augenrules[692]: backlog_wait_time_actual 0
Dec 05 05:22:05 localhost systemd[1]: Started Security Auditing Service.
Dec 05 05:22:05 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Dec 05 05:22:05 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Dec 05 05:22:05 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Dec 05 05:22:05 localhost systemd[1]: Finished Rebuild Hardware Database.
Dec 05 05:22:05 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec 05 05:22:05 localhost systemd[1]: Starting Update is Completed...
Dec 05 05:22:05 localhost systemd[1]: Finished Update is Completed.
Dec 05 05:22:05 localhost systemd-udevd[700]: Using default interface naming scheme 'rhel-9.0'.
Dec 05 05:22:05 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec 05 05:22:05 localhost systemd[1]: Reached target System Initialization.
Dec 05 05:22:05 localhost systemd[1]: Started dnf makecache --timer.
Dec 05 05:22:05 localhost systemd[1]: Started Daily rotation of log files.
Dec 05 05:22:05 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Dec 05 05:22:05 localhost systemd[1]: Reached target Timer Units.
Dec 05 05:22:05 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Dec 05 05:22:05 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Dec 05 05:22:05 localhost systemd[1]: Reached target Socket Units.
Dec 05 05:22:05 localhost systemd[1]: Starting D-Bus System Message Bus...
Dec 05 05:22:05 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 05 05:22:05 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Dec 05 05:22:05 localhost systemd[1]: Starting Load Kernel Module configfs...
Dec 05 05:22:05 localhost systemd-udevd[708]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 05:22:05 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 05 05:22:05 localhost systemd[1]: Finished Load Kernel Module configfs.
Dec 05 05:22:06 localhost systemd[1]: Started D-Bus System Message Bus.
Dec 05 05:22:06 localhost systemd[1]: Reached target Basic System.
Dec 05 05:22:06 localhost dbus-broker-lau[725]: Ready
Dec 05 05:22:06 localhost systemd[1]: Starting NTP client/server...
Dec 05 05:22:06 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Dec 05 05:22:06 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Dec 05 05:22:06 localhost systemd[1]: Starting IPv4 firewall with iptables...
Dec 05 05:22:06 localhost systemd[1]: Started irqbalance daemon.
Dec 05 05:22:06 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Dec 05 05:22:06 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 05 05:22:06 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 05 05:22:06 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 05 05:22:06 localhost systemd[1]: Reached target sshd-keygen.target.
Dec 05 05:22:06 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Dec 05 05:22:06 localhost systemd[1]: Reached target User and Group Name Lookups.
Dec 05 05:22:06 localhost systemd[1]: Starting User Login Management...
Dec 05 05:22:06 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Dec 05 05:22:06 localhost chronyd[752]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Dec 05 05:22:06 localhost chronyd[752]: Loaded 0 symmetric keys
Dec 05 05:22:06 localhost chronyd[752]: Using right/UTC timezone to obtain leap second data
Dec 05 05:22:06 localhost chronyd[752]: Loaded seccomp filter (level 2)
Dec 05 05:22:06 localhost systemd[1]: Started NTP client/server.
Dec 05 05:22:06 localhost systemd-logind[745]: Watching system buttons on /dev/input/event0 (Power Button)
Dec 05 05:22:06 localhost systemd-logind[745]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec 05 05:22:06 localhost systemd-logind[745]: New seat seat0.
Dec 05 05:22:06 localhost systemd[1]: Started User Login Management.
Dec 05 05:22:06 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Dec 05 05:22:06 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Dec 05 05:22:06 localhost kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt
Dec 05 05:22:06 localhost kernel: lpc_ich 0000:00:1f.0: I/O space for GPIO uninitialized
Dec 05 05:22:06 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Dec 05 05:22:06 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Dec 05 05:22:06 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Dec 05 05:22:06 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:01.0
Dec 05 05:22:06 localhost kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console
Dec 05 05:22:06 localhost kernel: Console: switching to colour dummy device 80x25
Dec 05 05:22:06 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Dec 05 05:22:06 localhost kernel: [drm] features: -context_init
Dec 05 05:22:06 localhost kernel: [drm] number of scanouts: 1
Dec 05 05:22:06 localhost kernel: [drm] number of cap sets: 0
Dec 05 05:22:06 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0
Dec 05 05:22:06 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Dec 05 05:22:06 localhost kernel: Console: switching to colour frame buffer device 160x50
Dec 05 05:22:06 localhost kernel: iTCO_vendor_support: vendor-support=0
Dec 05 05:22:06 localhost kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Dec 05 05:22:06 localhost kernel: iTCO_wdt iTCO_wdt.1.auto: Found a ICH9 TCO device (Version=2, TCOBASE=0x0660)
Dec 05 05:22:06 localhost kernel: iTCO_wdt iTCO_wdt.1.auto: initialized. heartbeat=30 sec (nowayout=0)
Dec 05 05:22:06 localhost kernel: kvm_amd: TSC scaling supported
Dec 05 05:22:06 localhost kernel: kvm_amd: Nested Virtualization enabled
Dec 05 05:22:06 localhost kernel: kvm_amd: Nested Paging enabled
Dec 05 05:22:06 localhost kernel: kvm_amd: LBR virtualization supported
Dec 05 05:22:06 localhost kernel: kvm_amd: Virtual VMLOAD VMSAVE supported
Dec 05 05:22:06 localhost kernel: kvm_amd: Virtual GIF supported
Dec 05 05:22:06 localhost iptables.init[737]: iptables: Applying firewall rules: [  OK  ]
Dec 05 05:22:06 localhost systemd[1]: Finished IPv4 firewall with iptables.
Dec 05 05:22:06 localhost cloud-init[792]: Cloud-init v. 24.4-7.el9 running 'init-local' at Fri, 05 Dec 2025 05:22:06 +0000. Up 5.10 seconds.
Dec 05 05:22:06 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Dec 05 05:22:06 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Dec 05 05:22:06 localhost systemd[1]: run-cloud\x2dinit-tmp-tmpxvkyhs4q.mount: Deactivated successfully.
Dec 05 05:22:06 localhost systemd[1]: Starting Hostname Service...
Dec 05 05:22:06 localhost systemd[1]: Started Hostname Service.
Dec 05 05:22:06 np0005546356 systemd-hostnamed[806]: Hostname set to <np0005546356> (static)
Dec 05 05:22:06 np0005546356 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Dec 05 05:22:06 np0005546356 systemd[1]: Reached target Preparation for Network.
Dec 05 05:22:06 np0005546356 systemd[1]: Starting Network Manager...
Dec 05 05:22:06 np0005546356 NetworkManager[810]: <info>  [1764912126.9215] NetworkManager (version 1.54.1-1.el9) is starting... (boot:81faa267-a78f-40df-a39b-c2f64c67c1ec)
Dec 05 05:22:06 np0005546356 NetworkManager[810]: <info>  [1764912126.9218] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /etc/NetworkManager/conf.d/99-cloud-init.conf
Dec 05 05:22:06 np0005546356 NetworkManager[810]: <info>  [1764912126.9318] manager[0x55c17b026080]: monitoring kernel firmware directory '/lib/firmware'.
Dec 05 05:22:06 np0005546356 NetworkManager[810]: <info>  [1764912126.9350] hostname: hostname: using hostnamed
Dec 05 05:22:06 np0005546356 NetworkManager[810]: <info>  [1764912126.9350] hostname: static hostname changed from (none) to "np0005546356"
Dec 05 05:22:06 np0005546356 NetworkManager[810]: <info>  [1764912126.9353] dns-mgr: init: dns=none,systemd-resolved rc-manager=unmanaged
Dec 05 05:22:06 np0005546356 NetworkManager[810]: <info>  [1764912126.9437] manager[0x55c17b026080]: rfkill: Wi-Fi hardware radio set enabled
Dec 05 05:22:06 np0005546356 NetworkManager[810]: <info>  [1764912126.9438] manager[0x55c17b026080]: rfkill: WWAN hardware radio set enabled
Dec 05 05:22:06 np0005546356 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Dec 05 05:22:06 np0005546356 NetworkManager[810]: <info>  [1764912126.9479] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec 05 05:22:06 np0005546356 NetworkManager[810]: <info>  [1764912126.9479] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 05 05:22:06 np0005546356 NetworkManager[810]: <info>  [1764912126.9479] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 05 05:22:06 np0005546356 NetworkManager[810]: <info>  [1764912126.9480] manager: Networking is enabled by state file
Dec 05 05:22:06 np0005546356 NetworkManager[810]: <info>  [1764912126.9481] settings: Loaded settings plugin: keyfile (internal)
Dec 05 05:22:06 np0005546356 NetworkManager[810]: <info>  [1764912126.9499] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 05 05:22:06 np0005546356 NetworkManager[810]: <info>  [1764912126.9526] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec 05 05:22:06 np0005546356 NetworkManager[810]: <info>  [1764912126.9542] dhcp: init: Using DHCP client 'internal'
Dec 05 05:22:06 np0005546356 NetworkManager[810]: <info>  [1764912126.9545] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 05 05:22:06 np0005546356 NetworkManager[810]: <info>  [1764912126.9560] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 05:22:06 np0005546356 NetworkManager[810]: <info>  [1764912126.9572] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec 05 05:22:06 np0005546356 NetworkManager[810]: <info>  [1764912126.9580] device (lo): Activation: starting connection 'lo' (fab12bac-354a-4d96-acbd-38603c43f0c0)
Dec 05 05:22:06 np0005546356 NetworkManager[810]: <info>  [1764912126.9590] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 05 05:22:06 np0005546356 NetworkManager[810]: <info>  [1764912126.9595] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 05 05:22:06 np0005546356 NetworkManager[810]: <info>  [1764912126.9624] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 05 05:22:06 np0005546356 NetworkManager[810]: <info>  [1764912126.9635] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec 05 05:22:06 np0005546356 NetworkManager[810]: <info>  [1764912126.9637] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec 05 05:22:06 np0005546356 NetworkManager[810]: <info>  [1764912126.9638] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec 05 05:22:06 np0005546356 NetworkManager[810]: <info>  [1764912126.9640] device (eth0): carrier: link connected
Dec 05 05:22:06 np0005546356 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 05 05:22:06 np0005546356 NetworkManager[810]: <info>  [1764912126.9644] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec 05 05:22:06 np0005546356 systemd[1]: Started Network Manager.
Dec 05 05:22:06 np0005546356 NetworkManager[810]: <info>  [1764912126.9657] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Dec 05 05:22:06 np0005546356 systemd[1]: Reached target Network.
Dec 05 05:22:06 np0005546356 NetworkManager[810]: <info>  [1764912126.9670] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 05 05:22:06 np0005546356 NetworkManager[810]: <info>  [1764912126.9675] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 05 05:22:06 np0005546356 NetworkManager[810]: <info>  [1764912126.9677] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 05 05:22:06 np0005546356 NetworkManager[810]: <info>  [1764912126.9682] manager: NetworkManager state is now CONNECTING
Dec 05 05:22:06 np0005546356 NetworkManager[810]: <info>  [1764912126.9685] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 05 05:22:06 np0005546356 systemd[1]: Starting Network Manager Wait Online...
Dec 05 05:22:06 np0005546356 NetworkManager[810]: <info>  [1764912126.9692] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 05 05:22:06 np0005546356 NetworkManager[810]: <info>  [1764912126.9699] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 05 05:22:06 np0005546356 NetworkManager[810]: <info>  [1764912126.9704] policy: set 'System eth0' (eth0) as default for IPv6 routing and DNS
Dec 05 05:22:06 np0005546356 NetworkManager[810]: <info>  [1764912126.9737] dhcp4 (eth0): state changed new lease, address=192.168.25.227
Dec 05 05:22:06 np0005546356 systemd[1]: Starting GSSAPI Proxy Daemon...
Dec 05 05:22:06 np0005546356 NetworkManager[810]: <info>  [1764912126.9758] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 05 05:22:06 np0005546356 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 05 05:22:06 np0005546356 NetworkManager[810]: <info>  [1764912126.9864] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec 05 05:22:06 np0005546356 NetworkManager[810]: <info>  [1764912126.9866] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec 05 05:22:06 np0005546356 NetworkManager[810]: <info>  [1764912126.9870] device (lo): Activation: successful, device activated.
Dec 05 05:22:06 np0005546356 systemd[1]: Started GSSAPI Proxy Daemon.
Dec 05 05:22:06 np0005546356 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec 05 05:22:06 np0005546356 systemd[1]: Reached target NFS client services.
Dec 05 05:22:06 np0005546356 systemd[1]: Reached target Preparation for Remote File Systems.
Dec 05 05:22:06 np0005546356 systemd[1]: Reached target Remote File Systems.
Dec 05 05:22:06 np0005546356 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 05 05:22:08 np0005546356 NetworkManager[810]: <info>  [1764912128.7924] dhcp6 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 05 05:22:09 np0005546356 NetworkManager[810]: <info>  [1764912129.8795] dhcp6 (eth0): state changed new lease, address=2001:db8::242
Dec 05 05:22:11 np0005546356 NetworkManager[810]: <info>  [1764912131.0334] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 05 05:22:11 np0005546356 NetworkManager[810]: <info>  [1764912131.0367] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 05 05:22:11 np0005546356 NetworkManager[810]: <info>  [1764912131.0368] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 05 05:22:11 np0005546356 NetworkManager[810]: <info>  [1764912131.0372] manager: NetworkManager state is now CONNECTED_SITE
Dec 05 05:22:11 np0005546356 NetworkManager[810]: <info>  [1764912131.0374] device (eth0): Activation: successful, device activated.
Dec 05 05:22:11 np0005546356 NetworkManager[810]: <info>  [1764912131.0378] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 05 05:22:11 np0005546356 NetworkManager[810]: <info>  [1764912131.0380] manager: startup complete
Dec 05 05:22:11 np0005546356 systemd[1]: Finished Network Manager Wait Online.
Dec 05 05:22:11 np0005546356 systemd[1]: Starting Cloud-init: Network Stage...
Dec 05 05:22:11 np0005546356 cloud-init[877]: Cloud-init v. 24.4-7.el9 running 'init' at Fri, 05 Dec 2025 05:22:11 +0000. Up 9.86 seconds.
Dec 05 05:22:11 np0005546356 cloud-init[877]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Dec 05 05:22:11 np0005546356 cloud-init[877]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 05 05:22:11 np0005546356 cloud-init[877]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Dec 05 05:22:11 np0005546356 cloud-init[877]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 05 05:22:11 np0005546356 cloud-init[877]: ci-info: |  eth0  | True |        192.168.25.227        | 255.255.255.0 | global | fa:16:3e:06:1a:18 |
Dec 05 05:22:11 np0005546356 cloud-init[877]: ci-info: |  eth0  | True |      2001:db8::242/128       |       .       | global | fa:16:3e:06:1a:18 |
Dec 05 05:22:11 np0005546356 cloud-init[877]: ci-info: |  eth0  | True | fe80::f816:3eff:fe06:1a18/64 |       .       |  link  | fa:16:3e:06:1a:18 |
Dec 05 05:22:11 np0005546356 cloud-init[877]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Dec 05 05:22:11 np0005546356 cloud-init[877]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Dec 05 05:22:11 np0005546356 cloud-init[877]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 05 05:22:11 np0005546356 cloud-init[877]: ci-info: ++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Dec 05 05:22:11 np0005546356 cloud-init[877]: ci-info: +-------+-----------------+--------------+-----------------+-----------+-------+
Dec 05 05:22:11 np0005546356 cloud-init[877]: ci-info: | Route |   Destination   |   Gateway    |     Genmask     | Interface | Flags |
Dec 05 05:22:11 np0005546356 cloud-init[877]: ci-info: +-------+-----------------+--------------+-----------------+-----------+-------+
Dec 05 05:22:11 np0005546356 cloud-init[877]: ci-info: |   0   |     0.0.0.0     | 192.168.25.1 |     0.0.0.0     |    eth0   |   UG  |
Dec 05 05:22:11 np0005546356 cloud-init[877]: ci-info: |   1   | 169.254.169.254 | 192.168.25.2 | 255.255.255.255 |    eth0   |  UGH  |
Dec 05 05:22:11 np0005546356 cloud-init[877]: ci-info: |   2   |   192.168.25.0  |   0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Dec 05 05:22:11 np0005546356 cloud-init[877]: ci-info: +-------+-----------------+--------------+-----------------+-----------+-------+
Dec 05 05:22:11 np0005546356 cloud-init[877]: ci-info: ++++++++++++++++++++++Route IPv6 info++++++++++++++++++++++
Dec 05 05:22:11 np0005546356 cloud-init[877]: ci-info: +-------+---------------+-------------+-----------+-------+
Dec 05 05:22:11 np0005546356 cloud-init[877]: ci-info: | Route |  Destination  |   Gateway   | Interface | Flags |
Dec 05 05:22:11 np0005546356 cloud-init[877]: ci-info: +-------+---------------+-------------+-----------+-------+
Dec 05 05:22:11 np0005546356 cloud-init[877]: ci-info: |   1   |  2001:db8::1  |      ::     |    eth0   |   U   |
Dec 05 05:22:11 np0005546356 cloud-init[877]: ci-info: |   2   | 2001:db8::242 |      ::     |    eth0   |   U   |
Dec 05 05:22:11 np0005546356 cloud-init[877]: ci-info: |   3   |   fe80::/64   |      ::     |    eth0   |   U   |
Dec 05 05:22:11 np0005546356 cloud-init[877]: ci-info: |   4   |      ::/0     | 2001:db8::1 |    eth0   |   UG  |
Dec 05 05:22:11 np0005546356 cloud-init[877]: ci-info: |   6   |     local     |      ::     |    eth0   |   U   |
Dec 05 05:22:11 np0005546356 cloud-init[877]: ci-info: |   7   |     local     |      ::     |    eth0   |   U   |
Dec 05 05:22:11 np0005546356 cloud-init[877]: ci-info: |   8   |   multicast   |      ::     |    eth0   |   U   |
Dec 05 05:22:11 np0005546356 cloud-init[877]: ci-info: +-------+---------------+-------------+-----------+-------+
Dec 05 05:22:11 np0005546356 useradd[944]: new group: name=cloud-user, GID=1001
Dec 05 05:22:11 np0005546356 useradd[944]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Dec 05 05:22:11 np0005546356 useradd[944]: add 'cloud-user' to group 'adm'
Dec 05 05:22:11 np0005546356 useradd[944]: add 'cloud-user' to group 'systemd-journal'
Dec 05 05:22:11 np0005546356 useradd[944]: add 'cloud-user' to shadow group 'adm'
Dec 05 05:22:11 np0005546356 useradd[944]: add 'cloud-user' to shadow group 'systemd-journal'
Dec 05 05:22:12 np0005546356 cloud-init[877]: Generating public/private rsa key pair.
Dec 05 05:22:12 np0005546356 cloud-init[877]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Dec 05 05:22:12 np0005546356 cloud-init[877]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Dec 05 05:22:12 np0005546356 cloud-init[877]: The key fingerprint is:
Dec 05 05:22:12 np0005546356 cloud-init[877]: SHA256:TBaumN+C+lPAAON84NxtKaaK0jEaZ/uEBqhg5S9zCTk root@np0005546356
Dec 05 05:22:12 np0005546356 cloud-init[877]: The key's randomart image is:
Dec 05 05:22:12 np0005546356 cloud-init[877]: +---[RSA 3072]----+
Dec 05 05:22:12 np0005546356 cloud-init[877]: |oo      .        |
Dec 05 05:22:12 np0005546356 cloud-init[877]: |=.+ . .. .       |
Dec 05 05:22:12 np0005546356 cloud-init[877]: | = O +  +        |
Dec 05 05:22:12 np0005546356 cloud-init[877]: |. * *o =         |
Dec 05 05:22:12 np0005546356 cloud-init[877]: |=o=Eo.. S        |
Dec 05 05:22:12 np0005546356 cloud-init[877]: |*B =+oo.         |
Dec 05 05:22:12 np0005546356 cloud-init[877]: |* =oo=o .        |
Dec 05 05:22:12 np0005546356 cloud-init[877]: |.. +=  .         |
Dec 05 05:22:12 np0005546356 cloud-init[877]: |  ..o.           |
Dec 05 05:22:12 np0005546356 cloud-init[877]: +----[SHA256]-----+
Dec 05 05:22:12 np0005546356 cloud-init[877]: Generating public/private ecdsa key pair.
Dec 05 05:22:12 np0005546356 cloud-init[877]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Dec 05 05:22:12 np0005546356 cloud-init[877]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Dec 05 05:22:12 np0005546356 cloud-init[877]: The key fingerprint is:
Dec 05 05:22:12 np0005546356 cloud-init[877]: SHA256:B0mZZb84fgBBTH4QJysApqqBPqST6CITcpeDotd90cE root@np0005546356
Dec 05 05:22:12 np0005546356 cloud-init[877]: The key's randomart image is:
Dec 05 05:22:12 np0005546356 cloud-init[877]: +---[ECDSA 256]---+
Dec 05 05:22:12 np0005546356 cloud-init[877]: |  o..  +O=+      |
Dec 05 05:22:12 np0005546356 cloud-init[877]: | o   . o+O .     |
Dec 05 05:22:12 np0005546356 cloud-init[877]: |.     . *.. .    |
Dec 05 05:22:12 np0005546356 cloud-init[877]: |o      . +E. .   |
Dec 05 05:22:12 np0005546356 cloud-init[877]: |+. . .  S.=..    |
Dec 05 05:22:12 np0005546356 cloud-init[877]: |Xoo +   .o.o     |
Dec 05 05:22:12 np0005546356 cloud-init[877]: |O* o o   .. .    |
Dec 05 05:22:12 np0005546356 cloud-init[877]: |*.o . . .  .     |
Dec 05 05:22:12 np0005546356 cloud-init[877]: |o+     .         |
Dec 05 05:22:12 np0005546356 cloud-init[877]: +----[SHA256]-----+
Dec 05 05:22:12 np0005546356 cloud-init[877]: Generating public/private ed25519 key pair.
Dec 05 05:22:12 np0005546356 cloud-init[877]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Dec 05 05:22:12 np0005546356 cloud-init[877]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Dec 05 05:22:12 np0005546356 cloud-init[877]: The key fingerprint is:
Dec 05 05:22:12 np0005546356 cloud-init[877]: SHA256:KXEPIzY1LrCtBuNHxpq7CDBhdynTgzfcYYoKjvZFAhk root@np0005546356
Dec 05 05:22:12 np0005546356 cloud-init[877]: The key's randomart image is:
Dec 05 05:22:12 np0005546356 cloud-init[877]: +--[ED25519 256]--+
Dec 05 05:22:12 np0005546356 cloud-init[877]: | Eo .  oo        |
Dec 05 05:22:12 np0005546356 cloud-init[877]: | ...=+=o..       |
Dec 05 05:22:12 np0005546356 cloud-init[877]: |o.oB=@B.=        |
Dec 05 05:22:12 np0005546356 cloud-init[877]: |=ooBBoo* =       |
Dec 05 05:22:12 np0005546356 cloud-init[877]: |+++ +.. S .      |
Dec 05 05:22:12 np0005546356 cloud-init[877]: |o..+.  .         |
Dec 05 05:22:12 np0005546356 cloud-init[877]: |. ..             |
Dec 05 05:22:12 np0005546356 cloud-init[877]: |.. .             |
Dec 05 05:22:12 np0005546356 cloud-init[877]: |. .              |
Dec 05 05:22:12 np0005546356 cloud-init[877]: +----[SHA256]-----+
Dec 05 05:22:12 np0005546356 systemd[1]: Finished Cloud-init: Network Stage.
Dec 05 05:22:12 np0005546356 systemd[1]: Reached target Cloud-config availability.
Dec 05 05:22:12 np0005546356 systemd[1]: Reached target Network is Online.
Dec 05 05:22:12 np0005546356 systemd[1]: Starting Cloud-init: Config Stage...
Dec 05 05:22:12 np0005546356 systemd[1]: Starting Crash recovery kernel arming...
Dec 05 05:22:12 np0005546356 systemd[1]: Starting Notify NFS peers of a restart...
Dec 05 05:22:12 np0005546356 sm-notify[960]: Version 2.5.4 starting
Dec 05 05:22:12 np0005546356 systemd[1]: Starting System Logging Service...
Dec 05 05:22:12 np0005546356 systemd[1]: Starting OpenSSH server daemon...
Dec 05 05:22:12 np0005546356 systemd[1]: Starting Permit User Sessions...
Dec 05 05:22:12 np0005546356 systemd[1]: Started Notify NFS peers of a restart.
Dec 05 05:22:12 np0005546356 sshd[962]: Server listening on 0.0.0.0 port 22.
Dec 05 05:22:12 np0005546356 sshd[962]: Server listening on :: port 22.
Dec 05 05:22:12 np0005546356 systemd[1]: Started OpenSSH server daemon.
Dec 05 05:22:12 np0005546356 systemd[1]: Finished Permit User Sessions.
Dec 05 05:22:12 np0005546356 systemd[1]: Started Command Scheduler.
Dec 05 05:22:12 np0005546356 rsyslogd[961]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="961" x-info="https://www.rsyslog.com"] start
Dec 05 05:22:12 np0005546356 rsyslogd[961]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Dec 05 05:22:12 np0005546356 crond[968]: (CRON) STARTUP (1.5.7)
Dec 05 05:22:12 np0005546356 crond[968]: (CRON) INFO (Syslog will be used instead of sendmail.)
Dec 05 05:22:12 np0005546356 crond[968]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 35% if used.)
Dec 05 05:22:12 np0005546356 crond[968]: (CRON) INFO (running with inotify support)
Dec 05 05:22:12 np0005546356 systemd[1]: Started Getty on tty1.
Dec 05 05:22:12 np0005546356 systemd[1]: Started Serial Getty on ttyS0.
Dec 05 05:22:12 np0005546356 systemd[1]: Reached target Login Prompts.
Dec 05 05:22:12 np0005546356 systemd[1]: Started System Logging Service.
Dec 05 05:22:12 np0005546356 systemd[1]: Reached target Multi-User System.
Dec 05 05:22:12 np0005546356 systemd[1]: Starting Record Runlevel Change in UTMP...
Dec 05 05:22:12 np0005546356 sshd-session[978]: Unable to negotiate with 192.168.25.11 port 53264: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Dec 05 05:22:12 np0005546356 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Dec 05 05:22:12 np0005546356 systemd[1]: Finished Record Runlevel Change in UTMP.
Dec 05 05:22:12 np0005546356 sshd-session[990]: Unable to negotiate with 192.168.25.11 port 53294: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Dec 05 05:22:12 np0005546356 sshd-session[996]: Unable to negotiate with 192.168.25.11 port 53306: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Dec 05 05:22:12 np0005546356 rsyslogd[961]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 05 05:22:12 np0005546356 sshd-session[1013]: Unable to negotiate with 192.168.25.11 port 53326: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Dec 05 05:22:12 np0005546356 sshd-session[1019]: Unable to negotiate with 192.168.25.11 port 53340: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Dec 05 05:22:12 np0005546356 sshd-session[965]: Connection closed by 192.168.25.11 port 53254 [preauth]
Dec 05 05:22:12 np0005546356 sshd-session[986]: Connection closed by 192.168.25.11 port 53280 [preauth]
Dec 05 05:22:12 np0005546356 kdumpctl[973]: kdump: No kdump initial ramdisk found.
Dec 05 05:22:12 np0005546356 kdumpctl[973]: kdump: Rebuilding /boot/initramfs-5.14.0-645.el9.x86_64kdump.img
Dec 05 05:22:12 np0005546356 sshd-session[999]: Connection closed by 192.168.25.11 port 53308 [preauth]
Dec 05 05:22:12 np0005546356 sshd-session[1005]: Connection closed by 192.168.25.11 port 53320 [preauth]
Dec 05 05:22:12 np0005546356 cloud-init[1087]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Fri, 05 Dec 2025 05:22:12 +0000. Up 11.28 seconds.
Dec 05 05:22:12 np0005546356 systemd[1]: Finished Cloud-init: Config Stage.
Dec 05 05:22:12 np0005546356 systemd[1]: Starting Cloud-init: Final Stage...
Dec 05 05:22:12 np0005546356 dracut[1239]: dracut-057-102.git20250818.el9
Dec 05 05:22:13 np0005546356 cloud-init[1257]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Fri, 05 Dec 2025 05:22:13 +0000. Up 11.62 seconds.
Dec 05 05:22:13 np0005546356 dracut[1241]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-645.el9.x86_64kdump.img 5.14.0-645.el9.x86_64
Dec 05 05:22:13 np0005546356 cloud-init[1269]: #############################################################
Dec 05 05:22:13 np0005546356 cloud-init[1273]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Dec 05 05:22:13 np0005546356 cloud-init[1279]: 256 SHA256:B0mZZb84fgBBTH4QJysApqqBPqST6CITcpeDotd90cE root@np0005546356 (ECDSA)
Dec 05 05:22:13 np0005546356 cloud-init[1284]: 256 SHA256:KXEPIzY1LrCtBuNHxpq7CDBhdynTgzfcYYoKjvZFAhk root@np0005546356 (ED25519)
Dec 05 05:22:13 np0005546356 cloud-init[1288]: 3072 SHA256:TBaumN+C+lPAAON84NxtKaaK0jEaZ/uEBqhg5S9zCTk root@np0005546356 (RSA)
Dec 05 05:22:13 np0005546356 cloud-init[1290]: -----END SSH HOST KEY FINGERPRINTS-----
Dec 05 05:22:13 np0005546356 cloud-init[1295]: #############################################################
Dec 05 05:22:13 np0005546356 cloud-init[1257]: Cloud-init v. 24.4-7.el9 finished at Fri, 05 Dec 2025 05:22:13 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 11.77 seconds
Dec 05 05:22:13 np0005546356 systemd[1]: Finished Cloud-init: Final Stage.
Dec 05 05:22:13 np0005546356 systemd[1]: Reached target Cloud-init target.
Dec 05 05:22:13 np0005546356 chronyd[752]: Selected source 23.186.168.132 (2.centos.pool.ntp.org)
Dec 05 05:22:13 np0005546356 chronyd[752]: System clock TAI offset set to 37 seconds
Dec 05 05:22:13 np0005546356 dracut[1241]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Dec 05 05:22:13 np0005546356 dracut[1241]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Dec 05 05:22:13 np0005546356 dracut[1241]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Dec 05 05:22:13 np0005546356 dracut[1241]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec 05 05:22:13 np0005546356 dracut[1241]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec 05 05:22:13 np0005546356 dracut[1241]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec 05 05:22:13 np0005546356 dracut[1241]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec 05 05:22:13 np0005546356 dracut[1241]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec 05 05:22:13 np0005546356 dracut[1241]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec 05 05:22:13 np0005546356 dracut[1241]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec 05 05:22:13 np0005546356 dracut[1241]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec 05 05:22:13 np0005546356 dracut[1241]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec 05 05:22:13 np0005546356 dracut[1241]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec 05 05:22:13 np0005546356 dracut[1241]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec 05 05:22:13 np0005546356 dracut[1241]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Dec 05 05:22:13 np0005546356 dracut[1241]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Dec 05 05:22:13 np0005546356 dracut[1241]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec 05 05:22:13 np0005546356 dracut[1241]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec 05 05:22:13 np0005546356 dracut[1241]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec 05 05:22:13 np0005546356 dracut[1241]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec 05 05:22:13 np0005546356 dracut[1241]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec 05 05:22:13 np0005546356 dracut[1241]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec 05 05:22:13 np0005546356 dracut[1241]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec 05 05:22:13 np0005546356 dracut[1241]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec 05 05:22:13 np0005546356 dracut[1241]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec 05 05:22:13 np0005546356 dracut[1241]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec 05 05:22:13 np0005546356 dracut[1241]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec 05 05:22:13 np0005546356 dracut[1241]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec 05 05:22:13 np0005546356 dracut[1241]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec 05 05:22:13 np0005546356 dracut[1241]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec 05 05:22:13 np0005546356 dracut[1241]: Module 'resume' will not be installed, because it's in the list to be omitted!
Dec 05 05:22:13 np0005546356 dracut[1241]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Dec 05 05:22:13 np0005546356 dracut[1241]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Dec 05 05:22:14 np0005546356 dracut[1241]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec 05 05:22:14 np0005546356 dracut[1241]: memstrack is not available
Dec 05 05:22:14 np0005546356 dracut[1241]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec 05 05:22:14 np0005546356 dracut[1241]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec 05 05:22:14 np0005546356 dracut[1241]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec 05 05:22:14 np0005546356 dracut[1241]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec 05 05:22:14 np0005546356 dracut[1241]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec 05 05:22:14 np0005546356 dracut[1241]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec 05 05:22:14 np0005546356 dracut[1241]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec 05 05:22:14 np0005546356 dracut[1241]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec 05 05:22:14 np0005546356 dracut[1241]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec 05 05:22:14 np0005546356 dracut[1241]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec 05 05:22:14 np0005546356 dracut[1241]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec 05 05:22:14 np0005546356 dracut[1241]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec 05 05:22:14 np0005546356 dracut[1241]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec 05 05:22:14 np0005546356 dracut[1241]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec 05 05:22:14 np0005546356 dracut[1241]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec 05 05:22:14 np0005546356 dracut[1241]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec 05 05:22:14 np0005546356 dracut[1241]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec 05 05:22:14 np0005546356 dracut[1241]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec 05 05:22:14 np0005546356 dracut[1241]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec 05 05:22:14 np0005546356 dracut[1241]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec 05 05:22:14 np0005546356 dracut[1241]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec 05 05:22:14 np0005546356 dracut[1241]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec 05 05:22:14 np0005546356 dracut[1241]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec 05 05:22:14 np0005546356 dracut[1241]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec 05 05:22:14 np0005546356 dracut[1241]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec 05 05:22:14 np0005546356 dracut[1241]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec 05 05:22:14 np0005546356 dracut[1241]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec 05 05:22:14 np0005546356 dracut[1241]: memstrack is not available
Dec 05 05:22:14 np0005546356 dracut[1241]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec 05 05:22:14 np0005546356 dracut[1241]: *** Including module: systemd ***
Dec 05 05:22:14 np0005546356 dracut[1241]: *** Including module: fips ***
Dec 05 05:22:14 np0005546356 dracut[1241]: *** Including module: systemd-initrd ***
Dec 05 05:22:14 np0005546356 dracut[1241]: *** Including module: i18n ***
Dec 05 05:22:14 np0005546356 dracut[1241]: *** Including module: drm ***
Dec 05 05:22:15 np0005546356 dracut[1241]: *** Including module: prefixdevname ***
Dec 05 05:22:15 np0005546356 dracut[1241]: *** Including module: kernel-modules ***
Dec 05 05:22:15 np0005546356 kernel: block vda: the capability attribute has been deprecated.
Dec 05 05:22:15 np0005546356 dracut[1241]: *** Including module: kernel-modules-extra ***
Dec 05 05:22:15 np0005546356 dracut[1241]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Dec 05 05:22:15 np0005546356 dracut[1241]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Dec 05 05:22:15 np0005546356 dracut[1241]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Dec 05 05:22:15 np0005546356 dracut[1241]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Dec 05 05:22:15 np0005546356 dracut[1241]: *** Including module: qemu ***
Dec 05 05:22:15 np0005546356 dracut[1241]: *** Including module: fstab-sys ***
Dec 05 05:22:15 np0005546356 dracut[1241]: *** Including module: rootfs-block ***
Dec 05 05:22:15 np0005546356 dracut[1241]: *** Including module: terminfo ***
Dec 05 05:22:15 np0005546356 dracut[1241]: *** Including module: udev-rules ***
Dec 05 05:22:16 np0005546356 dracut[1241]: Skipping udev rule: 91-permissions.rules
Dec 05 05:22:16 np0005546356 dracut[1241]: Skipping udev rule: 80-drivers-modprobe.rules
Dec 05 05:22:16 np0005546356 dracut[1241]: *** Including module: virtiofs ***
Dec 05 05:22:16 np0005546356 dracut[1241]: *** Including module: dracut-systemd ***
Dec 05 05:22:16 np0005546356 dracut[1241]: *** Including module: usrmount ***
Dec 05 05:22:16 np0005546356 dracut[1241]: *** Including module: base ***
Dec 05 05:22:16 np0005546356 dracut[1241]: *** Including module: fs-lib ***
Dec 05 05:22:16 np0005546356 dracut[1241]: *** Including module: kdumpbase ***
Dec 05 05:22:16 np0005546356 irqbalance[741]: Cannot change IRQ 45 affinity: Operation not permitted
Dec 05 05:22:16 np0005546356 irqbalance[741]: IRQ 45 affinity is now unmanaged
Dec 05 05:22:16 np0005546356 irqbalance[741]: Cannot change IRQ 44 affinity: Operation not permitted
Dec 05 05:22:16 np0005546356 irqbalance[741]: IRQ 44 affinity is now unmanaged
Dec 05 05:22:16 np0005546356 irqbalance[741]: Cannot change IRQ 42 affinity: Operation not permitted
Dec 05 05:22:16 np0005546356 irqbalance[741]: IRQ 42 affinity is now unmanaged
Dec 05 05:22:16 np0005546356 dracut[1241]: *** Including module: microcode_ctl-fw_dir_override ***
Dec 05 05:22:16 np0005546356 dracut[1241]:   microcode_ctl module: mangling fw_dir
Dec 05 05:22:16 np0005546356 dracut[1241]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Dec 05 05:22:16 np0005546356 dracut[1241]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Dec 05 05:22:16 np0005546356 dracut[1241]:     microcode_ctl: configuration "intel" is ignored
Dec 05 05:22:16 np0005546356 dracut[1241]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Dec 05 05:22:16 np0005546356 dracut[1241]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Dec 05 05:22:16 np0005546356 dracut[1241]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Dec 05 05:22:16 np0005546356 dracut[1241]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Dec 05 05:22:16 np0005546356 dracut[1241]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Dec 05 05:22:16 np0005546356 dracut[1241]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Dec 05 05:22:16 np0005546356 dracut[1241]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Dec 05 05:22:16 np0005546356 dracut[1241]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Dec 05 05:22:16 np0005546356 dracut[1241]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Dec 05 05:22:16 np0005546356 dracut[1241]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Dec 05 05:22:16 np0005546356 dracut[1241]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Dec 05 05:22:16 np0005546356 dracut[1241]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Dec 05 05:22:16 np0005546356 dracut[1241]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Dec 05 05:22:16 np0005546356 dracut[1241]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Dec 05 05:22:16 np0005546356 dracut[1241]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Dec 05 05:22:16 np0005546356 dracut[1241]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Dec 05 05:22:16 np0005546356 dracut[1241]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Dec 05 05:22:16 np0005546356 dracut[1241]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Dec 05 05:22:16 np0005546356 dracut[1241]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Dec 05 05:22:16 np0005546356 dracut[1241]: *** Including module: openssl ***
Dec 05 05:22:16 np0005546356 dracut[1241]: *** Including module: shutdown ***
Dec 05 05:22:16 np0005546356 dracut[1241]: *** Including module: squash ***
Dec 05 05:22:16 np0005546356 dracut[1241]: *** Including modules done ***
Dec 05 05:22:16 np0005546356 dracut[1241]: *** Installing kernel module dependencies ***
Dec 05 05:22:17 np0005546356 dracut[1241]: *** Installing kernel module dependencies done ***
Dec 05 05:22:17 np0005546356 dracut[1241]: *** Resolving executable dependencies ***
Dec 05 05:22:18 np0005546356 dracut[1241]: *** Resolving executable dependencies done ***
Dec 05 05:22:18 np0005546356 dracut[1241]: *** Generating early-microcode cpio image ***
Dec 05 05:22:18 np0005546356 dracut[1241]: *** Store current command line parameters ***
Dec 05 05:22:18 np0005546356 dracut[1241]: Stored kernel commandline:
Dec 05 05:22:18 np0005546356 dracut[1241]: No dracut internal kernel commandline stored in the initramfs
Dec 05 05:22:18 np0005546356 dracut[1241]: *** Install squash loader ***
Dec 05 05:22:19 np0005546356 dracut[1241]: *** Squashing the files inside the initramfs ***
Dec 05 05:22:20 np0005546356 dracut[1241]: *** Squashing the files inside the initramfs done ***
Dec 05 05:22:20 np0005546356 dracut[1241]: *** Creating image file '/boot/initramfs-5.14.0-645.el9.x86_64kdump.img' ***
Dec 05 05:22:20 np0005546356 dracut[1241]: *** Hardlinking files ***
Dec 05 05:22:20 np0005546356 dracut[1241]: Mode:           real
Dec 05 05:22:20 np0005546356 dracut[1241]: Files:          50
Dec 05 05:22:20 np0005546356 dracut[1241]: Linked:         0 files
Dec 05 05:22:20 np0005546356 dracut[1241]: Compared:       0 xattrs
Dec 05 05:22:20 np0005546356 dracut[1241]: Compared:       0 files
Dec 05 05:22:20 np0005546356 dracut[1241]: Saved:          0 B
Dec 05 05:22:20 np0005546356 dracut[1241]: Duration:       0.000410 seconds
Dec 05 05:22:20 np0005546356 dracut[1241]: *** Hardlinking files done ***
Dec 05 05:22:20 np0005546356 dracut[1241]: *** Creating initramfs image file '/boot/initramfs-5.14.0-645.el9.x86_64kdump.img' done ***
Dec 05 05:22:21 np0005546356 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 05 05:22:21 np0005546356 kdumpctl[973]: kdump: kexec: loaded kdump kernel
Dec 05 05:22:21 np0005546356 kdumpctl[973]: kdump: Starting kdump: [OK]
Dec 05 05:22:21 np0005546356 systemd[1]: Finished Crash recovery kernel arming.
Dec 05 05:22:21 np0005546356 systemd[1]: Startup finished in 1.331s (kernel) + 2.195s (initrd) + 16.197s (userspace) = 19.724s.
Dec 05 05:22:36 np0005546356 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 05 05:23:18 np0005546356 chronyd[752]: Selected source 172.235.32.243 (2.centos.pool.ntp.org)
Dec 05 05:23:41 np0005546356 sshd-session[4369]: Accepted publickey for zuul from 192.168.25.12 port 54812 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Dec 05 05:23:41 np0005546356 systemd[1]: Created slice User Slice of UID 1000.
Dec 05 05:23:41 np0005546356 systemd[1]: Starting User Runtime Directory /run/user/1000...
Dec 05 05:23:41 np0005546356 systemd-logind[745]: New session 1 of user zuul.
Dec 05 05:23:41 np0005546356 systemd[1]: Finished User Runtime Directory /run/user/1000.
Dec 05 05:23:41 np0005546356 systemd[1]: Starting User Manager for UID 1000...
Dec 05 05:23:41 np0005546356 systemd[4373]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 05:23:41 np0005546356 systemd[4373]: Queued start job for default target Main User Target.
Dec 05 05:23:41 np0005546356 systemd[4373]: Created slice User Application Slice.
Dec 05 05:23:41 np0005546356 systemd[4373]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 05 05:23:41 np0005546356 systemd[4373]: Started Daily Cleanup of User's Temporary Directories.
Dec 05 05:23:41 np0005546356 systemd[4373]: Reached target Paths.
Dec 05 05:23:41 np0005546356 systemd[4373]: Reached target Timers.
Dec 05 05:23:41 np0005546356 systemd[4373]: Starting D-Bus User Message Bus Socket...
Dec 05 05:23:41 np0005546356 systemd[4373]: Starting Create User's Volatile Files and Directories...
Dec 05 05:23:41 np0005546356 systemd[4373]: Listening on D-Bus User Message Bus Socket.
Dec 05 05:23:41 np0005546356 systemd[4373]: Reached target Sockets.
Dec 05 05:23:41 np0005546356 systemd[4373]: Finished Create User's Volatile Files and Directories.
Dec 05 05:23:41 np0005546356 systemd[4373]: Reached target Basic System.
Dec 05 05:23:41 np0005546356 systemd[4373]: Reached target Main User Target.
Dec 05 05:23:41 np0005546356 systemd[4373]: Startup finished in 76ms.
Dec 05 05:23:41 np0005546356 systemd[1]: Started User Manager for UID 1000.
Dec 05 05:23:41 np0005546356 systemd[1]: Started Session 1 of User zuul.
Dec 05 05:23:41 np0005546356 sshd-session[4369]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 05:23:42 np0005546356 python3[4455]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 05:23:43 np0005546356 python3[4483]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 05:23:48 np0005546356 python3[4537]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 05:23:49 np0005546356 python3[4577]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Dec 05 05:23:50 np0005546356 python3[4603]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDFHZ/BEKScYPqeuoMyJsqV2GWKwvzZ6mKmJtbDRHQ8ncFPKT1hjiJ9ytScyfAmzVnNQfVjdgq1QBqp6Hcte4k4avZUQmNp+82D50W3I/rf2bLOm7PoqBoGCeUTMCeo3DpBkjsHoJZ4ewNhK4Yj3U3JhL6wJ1a98cQ6GI+KC3n6OnC4nxIyRPet89lb7m38DUbHzTjXgk/sL5xS60grhun+/E62QgJQFvk6ub7xFUFm0DHc9aL2nLfU72oVNIM48nUp1nVWZTCxVsKcXyKofQrcx2blH7XbNZ/DaYxUtOD1V0FD+0suhU8QcEtSpL4juY86PGY5uynD9SIZx0nY9g70Zi43ZMdFk6RLVuDI+nkJaFBJ+ZkUosRGEelGtQZivXzx4W1yr0GukI33xL0yRqOgQcAjabZVtY4lgaRTlyuT2ZjTpt131KbkxE4286Zgcf48BoZ/5PcR+xysiGCOtnAL26OxY82I49HkhDzyO+zDv8zeX/Fx1dsvIjPPAHRXrU0= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 05:23:50 np0005546356 python3[4627]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:23:51 np0005546356 python3[4726]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 05:23:51 np0005546356 python3[4797]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764912231.0980918-229-7829622249735/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=7368a00ca3904157bd0e0b08b1687dd1_id_rsa follow=False checksum=09f27066728426aa12e3bde70892adc44f47934c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:23:51 np0005546356 python3[4920]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 05:23:52 np0005546356 python3[4991]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764912231.736771-273-71427696082948/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=7368a00ca3904157bd0e0b08b1687dd1_id_rsa.pub follow=False checksum=f2c2f8f4a234a169abf252db5aed8cc9a63417fd backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:23:53 np0005546356 python3[5039]: ansible-ping Invoked with data=pong
Dec 05 05:23:53 np0005546356 python3[5063]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 05:23:55 np0005546356 python3[5117]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Dec 05 05:23:55 np0005546356 python3[5149]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:23:55 np0005546356 python3[5173]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:23:56 np0005546356 python3[5197]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:23:56 np0005546356 python3[5221]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:23:56 np0005546356 python3[5245]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:23:56 np0005546356 python3[5269]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:23:57 np0005546356 sudo[5293]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abmadhkgoobowalocefpzbcrmjaxdwyi ; /usr/bin/python3'
Dec 05 05:23:57 np0005546356 sudo[5293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:23:58 np0005546356 python3[5295]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:23:58 np0005546356 sudo[5293]: pam_unix(sudo:session): session closed for user root
Dec 05 05:23:58 np0005546356 sudo[5371]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbfsesvkienuyuxelxsyaoppdqpqrrep ; /usr/bin/python3'
Dec 05 05:23:58 np0005546356 sudo[5371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:23:58 np0005546356 python3[5373]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 05:23:58 np0005546356 sudo[5371]: pam_unix(sudo:session): session closed for user root
Dec 05 05:23:58 np0005546356 sudo[5444]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tutflxftnjjlhnnhwadwvoqqvhcgxcno ; /usr/bin/python3'
Dec 05 05:23:58 np0005546356 sudo[5444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:23:58 np0005546356 python3[5446]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764912238.1536467-26-98267616813362/source follow=False _original_basename=mirror_info.sh.j2 checksum=8d04605e615eb785450b583fc5efd2437794600d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:23:58 np0005546356 sudo[5444]: pam_unix(sudo:session): session closed for user root
Dec 05 05:23:59 np0005546356 python3[5494]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 05:23:59 np0005546356 python3[5518]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 05:23:59 np0005546356 python3[5542]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 05:23:59 np0005546356 python3[5566]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 05:24:00 np0005546356 python3[5590]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 05:24:00 np0005546356 python3[5614]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 05:24:00 np0005546356 python3[5638]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 05:24:00 np0005546356 python3[5662]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 05:24:00 np0005546356 python3[5686]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 05:24:00 np0005546356 python3[5710]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 05:24:01 np0005546356 python3[5734]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 05:24:01 np0005546356 python3[5758]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 05:24:01 np0005546356 python3[5782]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 05:24:01 np0005546356 python3[5806]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 05:24:01 np0005546356 python3[5830]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 05:24:02 np0005546356 python3[5854]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 05:24:02 np0005546356 python3[5878]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 05:24:02 np0005546356 python3[5902]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 05:24:02 np0005546356 python3[5926]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 05:24:02 np0005546356 python3[5950]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 05:24:03 np0005546356 python3[5974]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 05:24:03 np0005546356 python3[5998]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 05:24:03 np0005546356 python3[6022]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 05:24:03 np0005546356 python3[6046]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 05:24:03 np0005546356 python3[6070]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 05:24:04 np0005546356 python3[6094]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 05:24:06 np0005546356 sudo[6118]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axrruiybocahexjegtgyekfpbsjipkom ; /usr/bin/python3'
Dec 05 05:24:06 np0005546356 sudo[6118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:24:06 np0005546356 python3[6120]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec 05 05:24:06 np0005546356 systemd[1]: Starting Time & Date Service...
Dec 05 05:24:06 np0005546356 systemd[1]: Started Time & Date Service.
Dec 05 05:24:06 np0005546356 systemd-timedated[6122]: Changed time zone to 'UTC' (UTC).
Dec 05 05:24:06 np0005546356 sudo[6118]: pam_unix(sudo:session): session closed for user root
Dec 05 05:24:06 np0005546356 sudo[6149]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adljiyyjognlteilecaujzbnqxpvwdwa ; /usr/bin/python3'
Dec 05 05:24:06 np0005546356 sudo[6149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:24:07 np0005546356 python3[6151]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:24:07 np0005546356 sudo[6149]: pam_unix(sudo:session): session closed for user root
Dec 05 05:24:07 np0005546356 python3[6227]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 05:24:07 np0005546356 python3[6298]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1764912247.1922994-202-186798288994165/source _original_basename=tmpdn3qgnm1 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:24:07 np0005546356 python3[6398]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 05:24:08 np0005546356 python3[6469]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764912247.765049-242-271106983505262/source _original_basename=tmp1ilatq_h follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:24:08 np0005546356 sudo[6569]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atknbtzwerjdgkicfksgzhvotpkbjldd ; /usr/bin/python3'
Dec 05 05:24:08 np0005546356 sudo[6569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:24:08 np0005546356 python3[6571]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 05:24:08 np0005546356 sudo[6569]: pam_unix(sudo:session): session closed for user root
Dec 05 05:24:08 np0005546356 sudo[6642]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spmspiponytxhbopdvhqffbfsxaxfidv ; /usr/bin/python3'
Dec 05 05:24:08 np0005546356 sudo[6642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:24:08 np0005546356 python3[6644]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764912248.525229-306-121874769682611/source _original_basename=tmpdgzuaqqy follow=False checksum=3376279f73d28facdd74ea12764d451a41993035 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:24:08 np0005546356 sudo[6642]: pam_unix(sudo:session): session closed for user root
Dec 05 05:24:09 np0005546356 python3[6692]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:24:09 np0005546356 python3[6718]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:24:09 np0005546356 sudo[6796]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgjqjnhprwnsvxcxmjcqrtwshcuhqgiv ; /usr/bin/python3'
Dec 05 05:24:09 np0005546356 sudo[6796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:24:09 np0005546356 python3[6798]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 05:24:09 np0005546356 sudo[6796]: pam_unix(sudo:session): session closed for user root
Dec 05 05:24:09 np0005546356 sudo[6869]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlfjvotnqmpxpatofpnciadapqoqeekr ; /usr/bin/python3'
Dec 05 05:24:09 np0005546356 sudo[6869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:24:10 np0005546356 python3[6871]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1764912249.7011814-362-153155707235625/source _original_basename=tmp4q1hmyct follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:24:10 np0005546356 sudo[6869]: pam_unix(sudo:session): session closed for user root
Dec 05 05:24:10 np0005546356 sudo[6920]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddazeeplevpbmzvvltbqdshjtcuyblxb ; /usr/bin/python3'
Dec 05 05:24:10 np0005546356 sudo[6920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:24:10 np0005546356 python3[6922]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163e6f-3cad-abb7-2110-00000000001e-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:24:10 np0005546356 sudo[6920]: pam_unix(sudo:session): session closed for user root
Dec 05 05:24:11 np0005546356 python3[6950]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                             _uses_shell=True zuul_log_id=fa163e6f-3cad-abb7-2110-00000000001f-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Dec 05 05:24:12 np0005546356 python3[6979]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:24:27 np0005546356 sudo[7003]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyaptdblejzbtvhclhmjqlfnoxqomecv ; /usr/bin/python3'
Dec 05 05:24:27 np0005546356 sudo[7003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:24:27 np0005546356 python3[7005]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:24:27 np0005546356 sudo[7003]: pam_unix(sudo:session): session closed for user root
Dec 05 05:24:36 np0005546356 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 05 05:24:53 np0005546356 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Dec 05 05:24:53 np0005546356 kernel: pci 0000:07:00.0: BAR 1 [mem 0x00000000-0x00000fff]
Dec 05 05:24:53 np0005546356 kernel: pci 0000:07:00.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Dec 05 05:24:53 np0005546356 kernel: pci 0000:07:00.0: ROM [mem 0x00000000-0x0003ffff pref]
Dec 05 05:24:53 np0005546356 kernel: pci 0000:07:00.0: ROM [mem 0xfe000000-0xfe03ffff pref]: assigned
Dec 05 05:24:53 np0005546356 kernel: pci 0000:07:00.0: BAR 4 [mem 0xfb600000-0xfb603fff 64bit pref]: assigned
Dec 05 05:24:53 np0005546356 kernel: pci 0000:07:00.0: BAR 1 [mem 0xfe040000-0xfe040fff]: assigned
Dec 05 05:24:53 np0005546356 kernel: virtio-pci 0000:07:00.0: enabling device (0000 -> 0002)
Dec 05 05:24:53 np0005546356 NetworkManager[810]: <info>  [1764912293.8784] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 05 05:24:53 np0005546356 systemd-udevd[7008]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 05:24:53 np0005546356 NetworkManager[810]: <info>  [1764912293.8904] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 05 05:24:53 np0005546356 NetworkManager[810]: <info>  [1764912293.8925] settings: (eth1): created default wired connection 'Wired connection 1'
Dec 05 05:24:53 np0005546356 NetworkManager[810]: <info>  [1764912293.8929] device (eth1): carrier: link connected
Dec 05 05:24:53 np0005546356 NetworkManager[810]: <info>  [1764912293.8932] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Dec 05 05:24:53 np0005546356 NetworkManager[810]: <info>  [1764912293.8937] policy: auto-activating connection 'Wired connection 1' (df66603f-7662-3e7d-b7d2-33281c48b328)
Dec 05 05:24:53 np0005546356 NetworkManager[810]: <info>  [1764912293.8941] device (eth1): Activation: starting connection 'Wired connection 1' (df66603f-7662-3e7d-b7d2-33281c48b328)
Dec 05 05:24:53 np0005546356 NetworkManager[810]: <info>  [1764912293.8942] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 05 05:24:53 np0005546356 NetworkManager[810]: <info>  [1764912293.8946] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 05 05:24:53 np0005546356 NetworkManager[810]: <info>  [1764912293.8953] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 05 05:24:53 np0005546356 NetworkManager[810]: <info>  [1764912293.8957] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 05 05:24:54 np0005546356 python3[7035]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163e6f-3cad-55c2-35f6-000000000112-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:25:01 np0005546356 sudo[7113]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzdpwyopiuhrzlduuamkjgsuvnglbacp ; OS_CLOUD=ibm-bm3-nodepool /usr/bin/python3'
Dec 05 05:25:01 np0005546356 sudo[7113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:25:01 np0005546356 python3[7115]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 05:25:01 np0005546356 sudo[7113]: pam_unix(sudo:session): session closed for user root
Dec 05 05:25:01 np0005546356 sudo[7186]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmmxcjadiznuiwlerqhsztpszfrwkeka ; OS_CLOUD=ibm-bm3-nodepool /usr/bin/python3'
Dec 05 05:25:01 np0005546356 sudo[7186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:25:01 np0005546356 python3[7188]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764912300.9305046-112-222170946628618/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=6a4edaa101c9e9df9977da849739b42f59a68ebe backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:25:01 np0005546356 sudo[7186]: pam_unix(sudo:session): session closed for user root
Dec 05 05:25:01 np0005546356 sudo[7236]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktvidfrkmdjvwokteivaurjsmsmltnve ; OS_CLOUD=ibm-bm3-nodepool /usr/bin/python3'
Dec 05 05:25:01 np0005546356 sudo[7236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:25:01 np0005546356 python3[7238]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 05:25:01 np0005546356 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec 05 05:25:01 np0005546356 systemd[1]: Stopped Network Manager Wait Online.
Dec 05 05:25:01 np0005546356 systemd[1]: Stopping Network Manager Wait Online...
Dec 05 05:25:01 np0005546356 NetworkManager[810]: <info>  [1764912301.8966] caught SIGTERM, shutting down normally.
Dec 05 05:25:01 np0005546356 systemd[1]: Stopping Network Manager...
Dec 05 05:25:01 np0005546356 NetworkManager[810]: <info>  [1764912301.8971] dhcp4 (eth0): canceled DHCP transaction
Dec 05 05:25:01 np0005546356 NetworkManager[810]: <info>  [1764912301.8971] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 05 05:25:01 np0005546356 NetworkManager[810]: <info>  [1764912301.8971] dhcp4 (eth0): state changed no lease
Dec 05 05:25:01 np0005546356 NetworkManager[810]: <info>  [1764912301.8972] dhcp6 (eth0): canceled DHCP transaction
Dec 05 05:25:01 np0005546356 NetworkManager[810]: <info>  [1764912301.8972] dhcp6 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 05 05:25:01 np0005546356 NetworkManager[810]: <info>  [1764912301.8972] dhcp6 (eth0): state changed no lease
Dec 05 05:25:01 np0005546356 NetworkManager[810]: <info>  [1764912301.8974] manager: NetworkManager state is now CONNECTING
Dec 05 05:25:01 np0005546356 NetworkManager[810]: <info>  [1764912301.9039] dhcp4 (eth1): canceled DHCP transaction
Dec 05 05:25:01 np0005546356 NetworkManager[810]: <info>  [1764912301.9039] dhcp4 (eth1): state changed no lease
Dec 05 05:25:01 np0005546356 NetworkManager[810]: <info>  [1764912301.9057] exiting (success)
Dec 05 05:25:01 np0005546356 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 05 05:25:01 np0005546356 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 05 05:25:01 np0005546356 systemd[1]: NetworkManager.service: Deactivated successfully.
Dec 05 05:25:01 np0005546356 systemd[1]: Stopped Network Manager.
Dec 05 05:25:01 np0005546356 systemd[1]: Starting Network Manager...
Dec 05 05:25:01 np0005546356 NetworkManager[7250]: <info>  [1764912301.9474] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:81faa267-a78f-40df-a39b-c2f64c67c1ec)
Dec 05 05:25:01 np0005546356 NetworkManager[7250]: <info>  [1764912301.9475] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /etc/NetworkManager/conf.d/99-cloud-init.conf
Dec 05 05:25:01 np0005546356 NetworkManager[7250]: <info>  [1764912301.9516] manager[0x55b7cf91d090]: monitoring kernel firmware directory '/lib/firmware'.
Dec 05 05:25:01 np0005546356 systemd[1]: Starting Hostname Service...
Dec 05 05:25:02 np0005546356 systemd[1]: Started Hostname Service.
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0049] hostname: hostname: using hostnamed
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0050] hostname: static hostname changed from (none) to "np0005546356"
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0052] dns-mgr: init: dns=none,systemd-resolved rc-manager=unmanaged
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0055] manager[0x55b7cf91d090]: rfkill: Wi-Fi hardware radio set enabled
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0055] manager[0x55b7cf91d090]: rfkill: WWAN hardware radio set enabled
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0074] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0074] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0074] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0075] manager: Networking is enabled by state file
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0076] settings: Loaded settings plugin: keyfile (internal)
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0079] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0096] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0102] dhcp: init: Using DHCP client 'internal'
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0104] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0108] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0112] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0117] device (lo): Activation: starting connection 'lo' (fab12bac-354a-4d96-acbd-38603c43f0c0)
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0123] device (eth0): carrier: link connected
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0127] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0131] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0132] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0137] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0142] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0146] device (eth1): carrier: link connected
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0150] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0154] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (df66603f-7662-3e7d-b7d2-33281c48b328) (indicated)
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0154] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0159] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0163] device (eth1): Activation: starting connection 'Wired connection 1' (df66603f-7662-3e7d-b7d2-33281c48b328)
Dec 05 05:25:02 np0005546356 systemd[1]: Started Network Manager.
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0170] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0180] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0182] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0184] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0185] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0187] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0189] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0191] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0192] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0197] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0199] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0201] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0203] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0208] policy: set 'System eth0' (eth0) as default for IPv6 routing and DNS
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0211] dhcp6 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0230] dhcp4 (eth0): state changed new lease, address=192.168.25.227
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0237] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 05 05:25:02 np0005546356 systemd[1]: Starting Network Manager Wait Online...
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0319] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec 05 05:25:02 np0005546356 sudo[7236]: pam_unix(sudo:session): session closed for user root
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0321] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec 05 05:25:02 np0005546356 NetworkManager[7250]: <info>  [1764912302.0323] device (lo): Activation: successful, device activated.
Dec 05 05:25:02 np0005546356 python3[7310]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163e6f-3cad-55c2-35f6-0000000000b2-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:25:03 np0005546356 NetworkManager[7250]: <info>  [1764912303.0720] dhcp6 (eth0): state changed new lease, address=2001:db8::242
Dec 05 05:25:03 np0005546356 NetworkManager[7250]: <info>  [1764912303.0727] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 05 05:25:03 np0005546356 NetworkManager[7250]: <info>  [1764912303.0759] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 05 05:25:03 np0005546356 NetworkManager[7250]: <info>  [1764912303.0760] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 05 05:25:03 np0005546356 NetworkManager[7250]: <info>  [1764912303.0761] manager: NetworkManager state is now CONNECTED_SITE
Dec 05 05:25:03 np0005546356 NetworkManager[7250]: <info>  [1764912303.0763] device (eth0): Activation: successful, device activated.
Dec 05 05:25:03 np0005546356 NetworkManager[7250]: <info>  [1764912303.0767] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 05 05:25:13 np0005546356 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 05 05:25:32 np0005546356 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 05 05:25:45 np0005546356 systemd[4373]: Starting Mark boot as successful...
Dec 05 05:25:45 np0005546356 systemd[4373]: Finished Mark boot as successful.
Dec 05 05:25:47 np0005546356 NetworkManager[7250]: <info>  [1764912347.4286] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 05 05:25:47 np0005546356 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 05 05:25:47 np0005546356 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 05 05:25:47 np0005546356 NetworkManager[7250]: <info>  [1764912347.4459] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 05 05:25:47 np0005546356 NetworkManager[7250]: <info>  [1764912347.4460] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 05 05:25:47 np0005546356 NetworkManager[7250]: <info>  [1764912347.4463] device (eth1): Activation: successful, device activated.
Dec 05 05:25:47 np0005546356 NetworkManager[7250]: <info>  [1764912347.4466] manager: startup complete
Dec 05 05:25:47 np0005546356 NetworkManager[7250]: <info>  [1764912347.4467] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Dec 05 05:25:47 np0005546356 NetworkManager[7250]: <warn>  [1764912347.4469] device (eth1): Activation: failed for connection 'Wired connection 1'
Dec 05 05:25:47 np0005546356 NetworkManager[7250]: <info>  [1764912347.4473] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Dec 05 05:25:47 np0005546356 systemd[1]: Finished Network Manager Wait Online.
Dec 05 05:25:47 np0005546356 NetworkManager[7250]: <info>  [1764912347.4518] dhcp4 (eth1): canceled DHCP transaction
Dec 05 05:25:47 np0005546356 NetworkManager[7250]: <info>  [1764912347.4519] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 05 05:25:47 np0005546356 NetworkManager[7250]: <info>  [1764912347.4519] dhcp4 (eth1): state changed no lease
Dec 05 05:25:47 np0005546356 NetworkManager[7250]: <info>  [1764912347.4527] policy: auto-activating connection 'ci-private-network' (b1f40c66-9ec5-5a4e-aa92-ad1bab91b90e)
Dec 05 05:25:47 np0005546356 NetworkManager[7250]: <info>  [1764912347.4530] device (eth1): Activation: starting connection 'ci-private-network' (b1f40c66-9ec5-5a4e-aa92-ad1bab91b90e)
Dec 05 05:25:47 np0005546356 NetworkManager[7250]: <info>  [1764912347.4530] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 05 05:25:47 np0005546356 NetworkManager[7250]: <info>  [1764912347.4532] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 05 05:25:47 np0005546356 NetworkManager[7250]: <info>  [1764912347.4536] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 05 05:25:47 np0005546356 NetworkManager[7250]: <info>  [1764912347.4542] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 05 05:25:47 np0005546356 NetworkManager[7250]: <info>  [1764912347.4559] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 05 05:25:47 np0005546356 NetworkManager[7250]: <info>  [1764912347.4560] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 05 05:25:47 np0005546356 NetworkManager[7250]: <info>  [1764912347.4564] device (eth1): Activation: successful, device activated.
Dec 05 05:25:55 np0005546356 sudo[7433]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxxluiuekxdzxqorqjwqipwojrethcwg ; OS_CLOUD=ibm-bm3-nodepool /usr/bin/python3'
Dec 05 05:25:55 np0005546356 sudo[7433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:25:55 np0005546356 python3[7435]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 05:25:55 np0005546356 sudo[7433]: pam_unix(sudo:session): session closed for user root
Dec 05 05:25:55 np0005546356 sudo[7506]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bilszxkxsxyfkmlngjncdnurfqmaoyiq ; OS_CLOUD=ibm-bm3-nodepool /usr/bin/python3'
Dec 05 05:25:55 np0005546356 sudo[7506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:25:55 np0005546356 python3[7508]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764912355.41391-318-143903925086275/source _original_basename=tmpx4zfhu7y follow=False checksum=ba5ca9d984dd1036fa9accb4896bfd63068ae039 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:25:55 np0005546356 sudo[7506]: pam_unix(sudo:session): session closed for user root
Dec 05 05:25:57 np0005546356 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 05 05:26:55 np0005546356 sshd-session[4382]: Received disconnect from 192.168.25.12 port 54812:11: disconnected by user
Dec 05 05:26:55 np0005546356 sshd-session[4382]: Disconnected from user zuul 192.168.25.12 port 54812
Dec 05 05:26:55 np0005546356 sshd-session[4369]: pam_unix(sshd:session): session closed for user zuul
Dec 05 05:26:55 np0005546356 systemd-logind[745]: Session 1 logged out. Waiting for processes to exit.
Dec 05 05:28:45 np0005546356 systemd[4373]: Created slice User Background Tasks Slice.
Dec 05 05:28:45 np0005546356 systemd[4373]: Starting Cleanup of User's Temporary Files and Directories...
Dec 05 05:28:45 np0005546356 systemd[4373]: Finished Cleanup of User's Temporary Files and Directories.
Dec 05 05:31:17 np0005546356 sshd-session[7536]: Accepted publickey for zuul from 192.168.25.12 port 47678 ssh2: RSA SHA256:hSuvLjQAeOLYPNIQksEiYf02vFOP4kLq7/I3UZnNO3s
Dec 05 05:31:17 np0005546356 systemd-logind[745]: New session 3 of user zuul.
Dec 05 05:31:17 np0005546356 systemd[1]: Started Session 3 of User zuul.
Dec 05 05:31:17 np0005546356 sshd-session[7536]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 05:31:17 np0005546356 sudo[7563]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtvrflbxjsdpdktmyiclrtjnopghrxzm ; /usr/bin/python3'
Dec 05 05:31:17 np0005546356 sudo[7563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:31:17 np0005546356 python3[7565]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda
                                             _uses_shell=True zuul_log_id=fa163e6f-3cad-1ff7-b35c-000000001cd1-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:31:17 np0005546356 sudo[7563]: pam_unix(sudo:session): session closed for user root
Dec 05 05:31:17 np0005546356 sudo[7592]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypuofwnmajtkdsrslxgarbdjosjnikau ; /usr/bin/python3'
Dec 05 05:31:17 np0005546356 sudo[7592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:31:17 np0005546356 python3[7594]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:31:17 np0005546356 sudo[7592]: pam_unix(sudo:session): session closed for user root
Dec 05 05:31:17 np0005546356 sudo[7618]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xaygfduofgkdghjhsooanqymmqqbeqoz ; /usr/bin/python3'
Dec 05 05:31:17 np0005546356 sudo[7618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:31:17 np0005546356 python3[7620]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:31:17 np0005546356 sudo[7618]: pam_unix(sudo:session): session closed for user root
Dec 05 05:31:18 np0005546356 sudo[7644]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqardippszumbkjhkwbersknudyvtvjf ; /usr/bin/python3'
Dec 05 05:31:18 np0005546356 sudo[7644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:31:18 np0005546356 python3[7646]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:31:18 np0005546356 sudo[7644]: pam_unix(sudo:session): session closed for user root
Dec 05 05:31:18 np0005546356 sudo[7670]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkvcftukytyvhnswwknjshsnqwrzhuwl ; /usr/bin/python3'
Dec 05 05:31:18 np0005546356 sudo[7670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:31:18 np0005546356 python3[7672]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:31:18 np0005546356 sudo[7670]: pam_unix(sudo:session): session closed for user root
Dec 05 05:31:18 np0005546356 sudo[7696]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fewayaolwiykdevolatyeffgimxoezin ; /usr/bin/python3'
Dec 05 05:31:18 np0005546356 sudo[7696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:31:18 np0005546356 python3[7698]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:31:18 np0005546356 sudo[7696]: pam_unix(sudo:session): session closed for user root
Dec 05 05:31:19 np0005546356 sudo[7774]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfoxgyaalkwzkmnrfrwvxpfrcchjqehu ; /usr/bin/python3'
Dec 05 05:31:19 np0005546356 sudo[7774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:31:19 np0005546356 python3[7776]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 05:31:19 np0005546356 sudo[7774]: pam_unix(sudo:session): session closed for user root
Dec 05 05:31:19 np0005546356 sudo[7847]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwiztiuprxououbmkhnwfiiwkpyigpjg ; /usr/bin/python3'
Dec 05 05:31:19 np0005546356 sudo[7847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:31:19 np0005546356 python3[7849]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764912679.0410645-490-198936487289104/source _original_basename=tmp51rwyl50 follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:31:19 np0005546356 sudo[7847]: pam_unix(sudo:session): session closed for user root
Dec 05 05:31:20 np0005546356 sudo[7897]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkdligyhibbemhlnrpwlxfkgwfuyksqr ; /usr/bin/python3'
Dec 05 05:31:20 np0005546356 sudo[7897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:31:20 np0005546356 python3[7899]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 05 05:31:20 np0005546356 systemd[1]: Reloading.
Dec 05 05:31:20 np0005546356 systemd-rc-local-generator[7920]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 05:31:20 np0005546356 sudo[7897]: pam_unix(sudo:session): session closed for user root
Dec 05 05:31:21 np0005546356 sudo[7953]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awrpfjjzmakclababwfbpyovhtwkiimi ; /usr/bin/python3'
Dec 05 05:31:21 np0005546356 sudo[7953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:31:21 np0005546356 python3[7955]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Dec 05 05:31:21 np0005546356 sudo[7953]: pam_unix(sudo:session): session closed for user root
Dec 05 05:31:21 np0005546356 sudo[7979]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oltsxxkdjuihyvqucwxdqtxcthemjaju ; /usr/bin/python3'
Dec 05 05:31:21 np0005546356 sudo[7979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:31:21 np0005546356 python3[7981]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                             _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:31:21 np0005546356 sudo[7979]: pam_unix(sudo:session): session closed for user root
Dec 05 05:31:21 np0005546356 sudo[8007]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwhwnqrabfhiogowhqmfbtwcgtahjaww ; /usr/bin/python3'
Dec 05 05:31:21 np0005546356 sudo[8007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:31:21 np0005546356 python3[8009]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                             _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:31:22 np0005546356 sudo[8007]: pam_unix(sudo:session): session closed for user root
Dec 05 05:31:22 np0005546356 sudo[8035]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aorogusoxgdvhqvaqonrtjgsvyxvascy ; /usr/bin/python3'
Dec 05 05:31:22 np0005546356 sudo[8035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:31:22 np0005546356 python3[8037]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                             _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:31:22 np0005546356 sudo[8035]: pam_unix(sudo:session): session closed for user root
Dec 05 05:31:22 np0005546356 sudo[8063]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scaviuvxuvcgufwiasoaaqujbpfqolkg ; /usr/bin/python3'
Dec 05 05:31:22 np0005546356 sudo[8063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:31:22 np0005546356 python3[8065]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                             _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:31:22 np0005546356 sudo[8063]: pam_unix(sudo:session): session closed for user root
Dec 05 05:31:22 np0005546356 python3[8092]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                             _uses_shell=True zuul_log_id=fa163e6f-3cad-1ff7-b35c-000000001cd8-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:31:23 np0005546356 python3[8122]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 05 05:31:25 np0005546356 sshd-session[7539]: Connection closed by 192.168.25.12 port 47678
Dec 05 05:31:25 np0005546356 sshd-session[7536]: pam_unix(sshd:session): session closed for user zuul
Dec 05 05:31:25 np0005546356 systemd[1]: session-3.scope: Deactivated successfully.
Dec 05 05:31:25 np0005546356 systemd[1]: session-3.scope: Consumed 2.871s CPU time.
Dec 05 05:31:25 np0005546356 systemd-logind[745]: Session 3 logged out. Waiting for processes to exit.
Dec 05 05:31:25 np0005546356 systemd-logind[745]: Removed session 3.
Dec 05 05:31:27 np0005546356 sshd-session[8129]: Accepted publickey for zuul from 192.168.25.12 port 42098 ssh2: RSA SHA256:hSuvLjQAeOLYPNIQksEiYf02vFOP4kLq7/I3UZnNO3s
Dec 05 05:31:27 np0005546356 systemd-logind[745]: New session 4 of user zuul.
Dec 05 05:31:27 np0005546356 systemd[1]: Started Session 4 of User zuul.
Dec 05 05:31:27 np0005546356 sshd-session[8129]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 05:31:27 np0005546356 sudo[8156]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uoeaalwceffdnybisagyefievkunnoih ; /usr/bin/python3'
Dec 05 05:31:27 np0005546356 sudo[8156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:31:27 np0005546356 python3[8158]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 05 05:31:42 np0005546356 kernel: SELinux:  Converting 386 SID table entries...
Dec 05 05:31:42 np0005546356 kernel: SELinux:  policy capability network_peer_controls=1
Dec 05 05:31:42 np0005546356 kernel: SELinux:  policy capability open_perms=1
Dec 05 05:31:42 np0005546356 kernel: SELinux:  policy capability extended_socket_class=1
Dec 05 05:31:42 np0005546356 kernel: SELinux:  policy capability always_check_network=0
Dec 05 05:31:42 np0005546356 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 05 05:31:42 np0005546356 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 05 05:31:42 np0005546356 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 05 05:31:48 np0005546356 kernel: SELinux:  Converting 386 SID table entries...
Dec 05 05:31:48 np0005546356 kernel: SELinux:  policy capability network_peer_controls=1
Dec 05 05:31:48 np0005546356 kernel: SELinux:  policy capability open_perms=1
Dec 05 05:31:48 np0005546356 kernel: SELinux:  policy capability extended_socket_class=1
Dec 05 05:31:48 np0005546356 kernel: SELinux:  policy capability always_check_network=0
Dec 05 05:31:48 np0005546356 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 05 05:31:48 np0005546356 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 05 05:31:48 np0005546356 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 05 05:31:55 np0005546356 kernel: SELinux:  Converting 386 SID table entries...
Dec 05 05:31:55 np0005546356 kernel: SELinux:  policy capability network_peer_controls=1
Dec 05 05:31:55 np0005546356 kernel: SELinux:  policy capability open_perms=1
Dec 05 05:31:55 np0005546356 kernel: SELinux:  policy capability extended_socket_class=1
Dec 05 05:31:55 np0005546356 kernel: SELinux:  policy capability always_check_network=0
Dec 05 05:31:55 np0005546356 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 05 05:31:55 np0005546356 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 05 05:31:55 np0005546356 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 05 05:31:55 np0005546356 setsebool[8224]: The virt_use_nfs policy boolean was changed to 1 by root
Dec 05 05:31:55 np0005546356 setsebool[8224]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Dec 05 05:32:03 np0005546356 kernel: SELinux:  Converting 389 SID table entries...
Dec 05 05:32:03 np0005546356 kernel: SELinux:  policy capability network_peer_controls=1
Dec 05 05:32:03 np0005546356 kernel: SELinux:  policy capability open_perms=1
Dec 05 05:32:03 np0005546356 kernel: SELinux:  policy capability extended_socket_class=1
Dec 05 05:32:03 np0005546356 kernel: SELinux:  policy capability always_check_network=0
Dec 05 05:32:03 np0005546356 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 05 05:32:03 np0005546356 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 05 05:32:03 np0005546356 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 05 05:32:16 np0005546356 dbus-broker-launch[732]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec 05 05:32:16 np0005546356 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 05 05:32:16 np0005546356 systemd[1]: Starting man-db-cache-update.service...
Dec 05 05:32:16 np0005546356 systemd[1]: Reloading.
Dec 05 05:32:16 np0005546356 systemd-rc-local-generator[8972]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 05:32:16 np0005546356 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 05 05:32:17 np0005546356 sudo[8156]: pam_unix(sudo:session): session closed for user root
Dec 05 05:32:30 np0005546356 python3[21927]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                              _uses_shell=True zuul_log_id=fa163e6f-3cad-f249-91e8-00000000000b-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:32:31 np0005546356 kernel: evm: overlay not supported
Dec 05 05:32:31 np0005546356 systemd[4373]: Starting D-Bus User Message Bus...
Dec 05 05:32:31 np0005546356 dbus-broker-launch[22779]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Dec 05 05:32:31 np0005546356 dbus-broker-launch[22779]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Dec 05 05:32:31 np0005546356 systemd[4373]: Started D-Bus User Message Bus.
Dec 05 05:32:31 np0005546356 dbus-broker-lau[22779]: Ready
Dec 05 05:32:31 np0005546356 systemd[4373]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec 05 05:32:31 np0005546356 systemd[4373]: Created slice Slice /user.
Dec 05 05:32:31 np0005546356 systemd[4373]: podman-22700.scope: unit configures an IP firewall, but not running as root.
Dec 05 05:32:31 np0005546356 systemd[4373]: (This warning is only shown for the first unit using IP firewalling.)
Dec 05 05:32:31 np0005546356 systemd[4373]: Started podman-22700.scope.
Dec 05 05:32:32 np0005546356 systemd[4373]: Started podman-pause-17c014c9.scope.
Dec 05 05:32:32 np0005546356 sshd-session[8132]: Connection closed by 192.168.25.12 port 42098
Dec 05 05:32:32 np0005546356 sshd-session[8129]: pam_unix(sshd:session): session closed for user zuul
Dec 05 05:32:32 np0005546356 systemd[1]: session-4.scope: Deactivated successfully.
Dec 05 05:32:32 np0005546356 systemd[1]: session-4.scope: Consumed 43.013s CPU time.
Dec 05 05:32:32 np0005546356 systemd-logind[745]: Session 4 logged out. Waiting for processes to exit.
Dec 05 05:32:32 np0005546356 systemd-logind[745]: Removed session 4.
Dec 05 05:32:39 np0005546356 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 05 05:32:39 np0005546356 systemd[1]: Finished man-db-cache-update.service.
Dec 05 05:32:39 np0005546356 systemd[1]: man-db-cache-update.service: Consumed 28.377s CPU time.
Dec 05 05:32:39 np0005546356 systemd[1]: run-ra47063e637e04b0c84f34f7a698c0bbe.service: Deactivated successfully.
Dec 05 05:32:47 np0005546356 sshd-session[29665]: Connection closed by 192.168.25.171 port 48886 [preauth]
Dec 05 05:32:47 np0005546356 sshd-session[29666]: Connection closed by 192.168.25.171 port 48888 [preauth]
Dec 05 05:32:47 np0005546356 sshd-session[29667]: Unable to negotiate with 192.168.25.171 port 48904: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Dec 05 05:32:47 np0005546356 sshd-session[29668]: Unable to negotiate with 192.168.25.171 port 48918: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Dec 05 05:32:47 np0005546356 sshd-session[29669]: Unable to negotiate with 192.168.25.171 port 48928: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Dec 05 05:32:57 np0005546356 sshd-session[29675]: Accepted publickey for zuul from 192.168.25.12 port 37548 ssh2: RSA SHA256:hSuvLjQAeOLYPNIQksEiYf02vFOP4kLq7/I3UZnNO3s
Dec 05 05:32:57 np0005546356 systemd-logind[745]: New session 5 of user zuul.
Dec 05 05:32:57 np0005546356 systemd[1]: Started Session 5 of User zuul.
Dec 05 05:32:57 np0005546356 sshd-session[29675]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 05:32:57 np0005546356 python3[29702]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBMrxfN6g1BOZJOVu3SdDVzkpa37Jy+/62ZBeAPdVbM4G+vvr5/BUlF5hFGI9STmXJxgN5iTMrMh8gC+s+xUgvbs= zuul@np0005546355
                                              manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 05:32:57 np0005546356 sudo[29726]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbehairdrffeilcdcmihyfjvldusxthw ; /usr/bin/python3'
Dec 05 05:32:57 np0005546356 sudo[29726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:32:57 np0005546356 python3[29728]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBMrxfN6g1BOZJOVu3SdDVzkpa37Jy+/62ZBeAPdVbM4G+vvr5/BUlF5hFGI9STmXJxgN5iTMrMh8gC+s+xUgvbs= zuul@np0005546355
                                              manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 05:32:57 np0005546356 sudo[29726]: pam_unix(sudo:session): session closed for user root
Dec 05 05:32:58 np0005546356 sudo[29752]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyyodoahwperjwzlmtipbvcgbdrecklh ; /usr/bin/python3'
Dec 05 05:32:58 np0005546356 sudo[29752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:32:58 np0005546356 python3[29754]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005546356 update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Dec 05 05:32:58 np0005546356 useradd[29756]: new group: name=cloud-admin, GID=1002
Dec 05 05:32:58 np0005546356 useradd[29756]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Dec 05 05:32:58 np0005546356 sudo[29752]: pam_unix(sudo:session): session closed for user root
Dec 05 05:32:58 np0005546356 sudo[29786]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htxhvsayhxfnhsmgfkvlnatnrxnhwtxw ; /usr/bin/python3'
Dec 05 05:32:58 np0005546356 sudo[29786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:32:58 np0005546356 python3[29788]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBMrxfN6g1BOZJOVu3SdDVzkpa37Jy+/62ZBeAPdVbM4G+vvr5/BUlF5hFGI9STmXJxgN5iTMrMh8gC+s+xUgvbs= zuul@np0005546355
                                              manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 05 05:32:58 np0005546356 sudo[29786]: pam_unix(sudo:session): session closed for user root
Dec 05 05:32:59 np0005546356 sudo[29864]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdjfrutyckvxnkenszmubxoxoiuavbwm ; /usr/bin/python3'
Dec 05 05:32:59 np0005546356 sudo[29864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:32:59 np0005546356 python3[29866]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 05:32:59 np0005546356 sudo[29864]: pam_unix(sudo:session): session closed for user root
Dec 05 05:32:59 np0005546356 sudo[29937]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yecbseygcwbblledmlxwgcveeylhknkj ; /usr/bin/python3'
Dec 05 05:32:59 np0005546356 sudo[29937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:32:59 np0005546356 python3[29939]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764912778.9369326-136-235855255018043/source _original_basename=tmp_5kkqbmi follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:32:59 np0005546356 sudo[29937]: pam_unix(sudo:session): session closed for user root
Dec 05 05:32:59 np0005546356 sudo[29987]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvxpecxupaphotcjxrmxuhgcwglvgthy ; /usr/bin/python3'
Dec 05 05:32:59 np0005546356 sudo[29987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:33:00 np0005546356 python3[29989]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Dec 05 05:33:00 np0005546356 systemd[1]: Starting Hostname Service...
Dec 05 05:33:00 np0005546356 systemd[1]: Started Hostname Service.
Dec 05 05:33:00 np0005546356 systemd-hostnamed[29993]: Changed pretty hostname to 'compute-0'
Dec 05 05:33:00 compute-0 systemd-hostnamed[29993]: Hostname set to <compute-0> (static)
Dec 05 05:33:00 compute-0 NetworkManager[7250]: <info>  [1764912780.1645] hostname: static hostname changed from "np0005546356" to "compute-0"
Dec 05 05:33:00 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 05 05:33:00 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 05 05:33:00 compute-0 sudo[29987]: pam_unix(sudo:session): session closed for user root
Dec 05 05:33:00 compute-0 sshd-session[29678]: Connection closed by 192.168.25.12 port 37548
Dec 05 05:33:00 compute-0 sshd-session[29675]: pam_unix(sshd:session): session closed for user zuul
Dec 05 05:33:00 compute-0 systemd[1]: session-5.scope: Deactivated successfully.
Dec 05 05:33:00 compute-0 systemd[1]: session-5.scope: Consumed 1.612s CPU time.
Dec 05 05:33:00 compute-0 systemd-logind[745]: Session 5 logged out. Waiting for processes to exit.
Dec 05 05:33:00 compute-0 systemd-logind[745]: Removed session 5.
Dec 05 05:33:01 compute-0 chronyd[752]: Selected source 23.186.168.132 (2.centos.pool.ntp.org)
Dec 05 05:33:10 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 05 05:33:30 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 05 05:36:14 compute-0 sshd-session[30013]: Accepted publickey for zuul from 192.168.25.171 port 54626 ssh2: RSA SHA256:hSuvLjQAeOLYPNIQksEiYf02vFOP4kLq7/I3UZnNO3s
Dec 05 05:36:14 compute-0 systemd-logind[745]: New session 6 of user zuul.
Dec 05 05:36:14 compute-0 systemd[1]: Started Session 6 of User zuul.
Dec 05 05:36:14 compute-0 sshd-session[30013]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 05:36:14 compute-0 python3[30089]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 05:36:15 compute-0 sudo[30199]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkjsylvjumayucqhmajodiirnhsvorzc ; /usr/bin/python3'
Dec 05 05:36:15 compute-0 sudo[30199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:36:15 compute-0 python3[30201]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 05:36:15 compute-0 sudo[30199]: pam_unix(sudo:session): session closed for user root
Dec 05 05:36:15 compute-0 sudo[30272]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsbezvcvyrygfkxotsefhxwcaevlhqqx ; /usr/bin/python3'
Dec 05 05:36:15 compute-0 sudo[30272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:36:16 compute-0 python3[30274]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764912975.5054152-34207-124728744225221/source mode=0755 _original_basename=delorean.repo follow=False checksum=4747551c791b9496c007f54094a6db3e9e669b46 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:36:16 compute-0 sudo[30272]: pam_unix(sudo:session): session closed for user root
Dec 05 05:36:16 compute-0 sudo[30298]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlgzlxqwdbsrpqfxevuhwaxgxtojvtmi ; /usr/bin/python3'
Dec 05 05:36:16 compute-0 sudo[30298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:36:16 compute-0 python3[30300]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-master-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 05:36:16 compute-0 sudo[30298]: pam_unix(sudo:session): session closed for user root
Dec 05 05:36:16 compute-0 sudo[30371]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gomflexxcaisuvwtkgimrjgqxtrzgaay ; /usr/bin/python3'
Dec 05 05:36:16 compute-0 sudo[30371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:36:16 compute-0 irqbalance[741]: Cannot change IRQ 43 affinity: Operation not permitted
Dec 05 05:36:16 compute-0 python3[30373]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764912975.5054152-34207-124728744225221/source mode=0755 _original_basename=delorean-master-testing.repo follow=False checksum=6bd8a06d03894882d0569b1ebdede8293e9657fa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:36:16 compute-0 irqbalance[741]: IRQ 43 affinity is now unmanaged
Dec 05 05:36:16 compute-0 sudo[30371]: pam_unix(sudo:session): session closed for user root
Dec 05 05:36:16 compute-0 sudo[30397]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sylsvaxuveczgllhxjujnlswgiwlkxpx ; /usr/bin/python3'
Dec 05 05:36:16 compute-0 sudo[30397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:36:16 compute-0 python3[30399]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 05:36:16 compute-0 sudo[30397]: pam_unix(sudo:session): session closed for user root
Dec 05 05:36:16 compute-0 sudo[30470]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aywypsjqadlmagcurffnpqahyyfbpmpg ; /usr/bin/python3'
Dec 05 05:36:16 compute-0 sudo[30470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:36:16 compute-0 python3[30472]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764912975.5054152-34207-124728744225221/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=5c739387d960f7119f9d22475c90dcd56f13e885 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:36:16 compute-0 sudo[30470]: pam_unix(sudo:session): session closed for user root
Dec 05 05:36:16 compute-0 sudo[30496]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onohjfualumsjbodaevwdbgvijyluwoz ; /usr/bin/python3'
Dec 05 05:36:16 compute-0 sudo[30496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:36:17 compute-0 python3[30498]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 05:36:17 compute-0 sudo[30496]: pam_unix(sudo:session): session closed for user root
Dec 05 05:36:17 compute-0 sudo[30569]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnbepgxbmvblrxkerqcfrvwlkspmisfx ; /usr/bin/python3'
Dec 05 05:36:17 compute-0 sudo[30569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:36:17 compute-0 python3[30571]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764912975.5054152-34207-124728744225221/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=8c00581855ef07972e002c82cc33b7b03ecccc44 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:36:17 compute-0 sudo[30569]: pam_unix(sudo:session): session closed for user root
Dec 05 05:36:17 compute-0 sudo[30595]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzqguebnuafynshglskzfjwhklzvzlxb ; /usr/bin/python3'
Dec 05 05:36:17 compute-0 sudo[30595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:36:17 compute-0 python3[30597]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 05:36:17 compute-0 sudo[30595]: pam_unix(sudo:session): session closed for user root
Dec 05 05:36:17 compute-0 sudo[30668]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twpvnwpfoyonuglxwgbfdiuzagqfsuuu ; /usr/bin/python3'
Dec 05 05:36:17 compute-0 sudo[30668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:36:17 compute-0 python3[30670]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764912975.5054152-34207-124728744225221/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=5515871802d2268513e691cf460c59c7da7132f9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:36:17 compute-0 sudo[30668]: pam_unix(sudo:session): session closed for user root
Dec 05 05:36:17 compute-0 sudo[30694]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esyfmybsloqdkhvrzutqnivxozvqayim ; /usr/bin/python3'
Dec 05 05:36:17 compute-0 sudo[30694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:36:17 compute-0 python3[30696]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 05:36:17 compute-0 sudo[30694]: pam_unix(sudo:session): session closed for user root
Dec 05 05:36:17 compute-0 sudo[30767]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxgyxuefqwrwalrkmlczavnrbdovqbeb ; /usr/bin/python3'
Dec 05 05:36:17 compute-0 sudo[30767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:36:18 compute-0 python3[30769]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764912975.5054152-34207-124728744225221/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=c87c0371a768c46886c8904021e8b85df789a625 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:36:18 compute-0 sudo[30767]: pam_unix(sudo:session): session closed for user root
Dec 05 05:36:18 compute-0 sudo[30793]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxdzuklvuifbzyghksplkcoxnwrrqmus ; /usr/bin/python3'
Dec 05 05:36:18 compute-0 sudo[30793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:36:18 compute-0 python3[30795]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 05 05:36:18 compute-0 sudo[30793]: pam_unix(sudo:session): session closed for user root
Dec 05 05:36:18 compute-0 sudo[30866]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usdcdhvwmzotiwqqgisojmaaerwatlim ; /usr/bin/python3'
Dec 05 05:36:18 compute-0 sudo[30866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:36:18 compute-0 python3[30868]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764912975.5054152-34207-124728744225221/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=fa2c662325f345c065cf09a4d87ff5b21ab5eb35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:36:18 compute-0 sudo[30866]: pam_unix(sudo:session): session closed for user root
Dec 05 05:36:20 compute-0 sshd-session[30893]: Connection closed by 192.168.122.11 port 34600 [preauth]
Dec 05 05:36:20 compute-0 sshd-session[30894]: Connection closed by 192.168.122.11 port 34604 [preauth]
Dec 05 05:36:20 compute-0 sshd-session[30895]: Unable to negotiate with 192.168.122.11 port 34606: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Dec 05 05:36:20 compute-0 sshd-session[30896]: Unable to negotiate with 192.168.122.11 port 34608: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Dec 05 05:36:20 compute-0 sshd-session[30897]: Unable to negotiate with 192.168.122.11 port 34614: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Dec 05 05:37:26 compute-0 python3[30926]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:37:26 compute-0 systemd[1]: Starting Cleanup of Temporary Directories...
Dec 05 05:37:26 compute-0 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Dec 05 05:37:26 compute-0 systemd[1]: Finished Cleanup of Temporary Directories.
Dec 05 05:37:26 compute-0 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Dec 05 05:41:45 compute-0 systemd[1]: Starting dnf makecache...
Dec 05 05:41:45 compute-0 dnf[30931]: Failed determining last makecache time.
Dec 05 05:41:45 compute-0 dnf[30931]: delorean-python-castellan-609f4ea667df386849930  80 kB/s |  13 kB     00:00
Dec 05 05:41:45 compute-0 dnf[30931]: delorean-openstack-ironic-c525a16b06266b6b474c9 406 kB/s |  64 kB     00:00
Dec 05 05:41:46 compute-0 dnf[30931]: delorean-openstack-cinder-92c645f1f1e913b5b1cd8 213 kB/s |  30 kB     00:00
Dec 05 05:41:46 compute-0 dnf[30931]: delorean-ansible-collections-openstack-f584c54d 766 kB/s | 121 kB     00:00
Dec 05 05:41:46 compute-0 dnf[30931]: delorean-openstack-ceilometer-60803e710e7f5b3cd 152 kB/s |  24 kB     00:00
Dec 05 05:41:46 compute-0 dnf[30931]: delorean-openstack-kolla-e7bd46dad0b62ff151667b 1.7 MB/s | 274 kB     00:00
Dec 05 05:41:46 compute-0 dnf[30931]: delorean-openstack-nova-3e7017eb2952d5258d96e27 250 kB/s |  37 kB     00:00
Dec 05 05:41:46 compute-0 dnf[30931]: delorean-openstack-designate-82652559ea8641b11c 136 kB/s |  19 kB     00:00
Dec 05 05:41:47 compute-0 dnf[30931]: delorean-openstack-glance-e055873be4079bc9d3716 131 kB/s |  19 kB     00:00
Dec 05 05:41:47 compute-0 dnf[30931]: delorean-openstack-keystone-4f1b7e96e38463d5fcd 160 kB/s |  23 kB     00:00
Dec 05 05:41:47 compute-0 dnf[30931]: delorean-openstack-manila-70623bb84e7880f7f2f75 179 kB/s |  27 kB     00:00
Dec 05 05:41:47 compute-0 dnf[30931]: delorean-python-networking-mlnx-7139a7f0bce9d6a 898 kB/s | 130 kB     00:00
Dec 05 05:41:47 compute-0 dnf[30931]: delorean-openstack-octavia-e981d3e172b8e4471f97 171 kB/s |  25 kB     00:00
Dec 05 05:41:47 compute-0 dnf[30931]: delorean-openstack-watcher-71470dac73abba9e5dcf 122 kB/s |  17 kB     00:00
Dec 05 05:41:48 compute-0 dnf[30931]: delorean-ansible-config_template-5ccaa22121a7ff  54 kB/s | 7.9 kB     00:00
Dec 05 05:41:48 compute-0 dnf[30931]: delorean-puppet-magnum-ec92e647ad5e77720f01cce0 1.0 MB/s | 155 kB     00:00
Dec 05 05:41:48 compute-0 dnf[30931]: delorean-openstack-swift-e10c2bafcb8fc80929bce3  98 kB/s |  15 kB     00:00
Dec 05 05:41:48 compute-0 dnf[30931]: delorean-python-mistral-tests-tempest-900580c95 225 kB/s |  35 kB     00:00
Dec 05 05:41:48 compute-0 dnf[30931]: delorean-python-django-horizon-915b939b342dc65f 725 kB/s | 105 kB     00:00
Dec 05 05:41:50 compute-0 dnf[30931]: CentOS Stream 9 - BaseOS                        4.8 kB/s | 7.3 kB     00:01
Dec 05 05:41:51 compute-0 dnf[30931]: CentOS Stream 9 - AppStream                     9.3 kB/s | 7.4 kB     00:00
Dec 05 05:41:51 compute-0 dnf[30931]: CentOS Stream 9 - CRB                           9.5 kB/s | 7.2 kB     00:00
Dec 05 05:41:52 compute-0 dnf[30931]: CentOS Stream 9 - Extras packages                17 kB/s | 8.3 kB     00:00
Dec 05 05:41:52 compute-0 dnf[30931]: dlrn-master-testing                              16 MB/s | 2.4 MB     00:00
Dec 05 05:41:53 compute-0 dnf[30931]: dlrn-master-build-deps                          3.4 MB/s | 516 kB     00:00
Dec 05 05:41:53 compute-0 dnf[30931]: centos9-rabbitmq                                3.0 MB/s | 123 kB     00:00
Dec 05 05:41:53 compute-0 dnf[30931]: centos9-storage                                  30 MB/s | 415 kB     00:00
Dec 05 05:41:53 compute-0 dnf[30931]: centos9-opstools                                4.2 MB/s |  51 kB     00:00
Dec 05 05:41:53 compute-0 dnf[30931]: NFV SIG OpenvSwitch                              34 MB/s | 456 kB     00:00
Dec 05 05:41:54 compute-0 dnf[30931]: repo-setup-centos-appstream                     274 MB/s |  25 MB     00:00
Dec 05 05:41:58 compute-0 dnf[30931]: repo-setup-centos-baseos                        213 MB/s | 8.8 MB     00:00
Dec 05 05:41:59 compute-0 dnf[30931]: repo-setup-centos-highavailability               53 MB/s | 744 kB     00:00
Dec 05 05:41:59 compute-0 dnf[30931]: repo-setup-centos-powertools                    202 MB/s | 7.3 MB     00:00
Dec 05 05:42:12 compute-0 dnf[30931]: Extra Packages for Enterprise Linux 9 - x86_64  1.8 MB/s |  20 MB     00:11
Dec 05 05:42:22 compute-0 dnf[30931]: Metadata cache created.
Dec 05 05:42:22 compute-0 systemd[1]: dnf-makecache.service: Deactivated successfully.
Dec 05 05:42:22 compute-0 systemd[1]: Finished dnf makecache.
Dec 05 05:42:22 compute-0 systemd[1]: dnf-makecache.service: Consumed 19.046s CPU time.
Dec 05 05:42:25 compute-0 sshd-session[30016]: Received disconnect from 192.168.25.171 port 54626:11: disconnected by user
Dec 05 05:42:25 compute-0 sshd-session[30016]: Disconnected from user zuul 192.168.25.171 port 54626
Dec 05 05:42:25 compute-0 sshd-session[30013]: pam_unix(sshd:session): session closed for user zuul
Dec 05 05:42:25 compute-0 systemd[1]: session-6.scope: Deactivated successfully.
Dec 05 05:42:25 compute-0 systemd[1]: session-6.scope: Consumed 3.327s CPU time.
Dec 05 05:42:25 compute-0 systemd-logind[745]: Session 6 logged out. Waiting for processes to exit.
Dec 05 05:42:25 compute-0 systemd-logind[745]: Removed session 6.
Dec 05 05:48:38 compute-0 sshd-session[31035]: Accepted publickey for zuul from 192.168.122.30 port 42026 ssh2: ECDSA SHA256:qs1gnk6DlsWBMwOXH28ouwL9ltr8rmS6SqK96Fn6Lw8
Dec 05 05:48:38 compute-0 systemd-logind[745]: New session 7 of user zuul.
Dec 05 05:48:38 compute-0 systemd[1]: Started Session 7 of User zuul.
Dec 05 05:48:38 compute-0 sshd-session[31035]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 05:48:39 compute-0 python3.9[31188]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 05:48:39 compute-0 sudo[31367]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-euuexdntdnocwrfhjpygefpvxbeyzkur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913719.7140214-44-30391149750475/AnsiballZ_command.py'
Dec 05 05:48:39 compute-0 sudo[31367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:48:40 compute-0 python3.9[31369]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:48:48 compute-0 sudo[31367]: pam_unix(sudo:session): session closed for user root
Dec 05 05:48:48 compute-0 sshd-session[31038]: Connection closed by 192.168.122.30 port 42026
Dec 05 05:48:48 compute-0 sshd-session[31035]: pam_unix(sshd:session): session closed for user zuul
Dec 05 05:48:48 compute-0 systemd[1]: session-7.scope: Deactivated successfully.
Dec 05 05:48:48 compute-0 systemd[1]: session-7.scope: Consumed 6.072s CPU time.
Dec 05 05:48:48 compute-0 systemd-logind[745]: Session 7 logged out. Waiting for processes to exit.
Dec 05 05:48:48 compute-0 systemd-logind[745]: Removed session 7.
Dec 05 05:48:53 compute-0 sshd-session[31427]: Accepted publickey for zuul from 192.168.122.30 port 42772 ssh2: ECDSA SHA256:qs1gnk6DlsWBMwOXH28ouwL9ltr8rmS6SqK96Fn6Lw8
Dec 05 05:48:53 compute-0 systemd-logind[745]: New session 8 of user zuul.
Dec 05 05:48:53 compute-0 systemd[1]: Started Session 8 of User zuul.
Dec 05 05:48:53 compute-0 sshd-session[31427]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 05:48:54 compute-0 python3.9[31580]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 05:48:54 compute-0 sshd-session[31430]: Connection closed by 192.168.122.30 port 42772
Dec 05 05:48:54 compute-0 sshd-session[31427]: pam_unix(sshd:session): session closed for user zuul
Dec 05 05:48:54 compute-0 systemd-logind[745]: Session 8 logged out. Waiting for processes to exit.
Dec 05 05:48:54 compute-0 systemd[1]: session-8.scope: Deactivated successfully.
Dec 05 05:48:54 compute-0 systemd-logind[745]: Removed session 8.
Dec 05 05:49:08 compute-0 chronyd[752]: Selected source 172.235.32.243 (2.centos.pool.ntp.org)
Dec 05 05:49:09 compute-0 sshd-session[31608]: Accepted publickey for zuul from 192.168.122.30 port 32908 ssh2: ECDSA SHA256:qs1gnk6DlsWBMwOXH28ouwL9ltr8rmS6SqK96Fn6Lw8
Dec 05 05:49:09 compute-0 systemd-logind[745]: New session 9 of user zuul.
Dec 05 05:49:09 compute-0 systemd[1]: Started Session 9 of User zuul.
Dec 05 05:49:09 compute-0 sshd-session[31608]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 05:49:10 compute-0 python3.9[31761]: ansible-ansible.legacy.ping Invoked with data=pong
Dec 05 05:49:10 compute-0 python3.9[31935]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 05:49:11 compute-0 sudo[32085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsjsyyqycxjlyqxajkvlhgzdlfckaxdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913751.1995215-69-156748296162284/AnsiballZ_command.py'
Dec 05 05:49:11 compute-0 sudo[32085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:49:11 compute-0 python3.9[32087]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:49:11 compute-0 sudo[32085]: pam_unix(sudo:session): session closed for user root
Dec 05 05:49:12 compute-0 sudo[32238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-diwpuajtpqjpwurgtksojqvejzocjcjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913751.887675-93-73183861942560/AnsiballZ_stat.py'
Dec 05 05:49:12 compute-0 sudo[32238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:49:12 compute-0 python3.9[32240]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 05:49:12 compute-0 sudo[32238]: pam_unix(sudo:session): session closed for user root
Dec 05 05:49:12 compute-0 sudo[32390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddexrgrpxlypzeouewbulscbxozchpdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913752.4406104-109-251863661033796/AnsiballZ_file.py'
Dec 05 05:49:12 compute-0 sudo[32390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:49:12 compute-0 python3.9[32392]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:49:12 compute-0 sudo[32390]: pam_unix(sudo:session): session closed for user root
Dec 05 05:49:13 compute-0 sudo[32542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyapkexttrjsjdkqlqznhorptefrsfuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913753.017625-125-256915075142987/AnsiballZ_stat.py'
Dec 05 05:49:13 compute-0 sudo[32542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:49:13 compute-0 python3.9[32544]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:49:13 compute-0 sudo[32542]: pam_unix(sudo:session): session closed for user root
Dec 05 05:49:13 compute-0 sudo[32665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adxplfhucawsbmewdddrufppbaacruju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913753.017625-125-256915075142987/AnsiballZ_copy.py'
Dec 05 05:49:13 compute-0 sudo[32665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:49:13 compute-0 python3.9[32667]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764913753.017625-125-256915075142987/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:49:13 compute-0 sudo[32665]: pam_unix(sudo:session): session closed for user root
Dec 05 05:49:14 compute-0 sudo[32817]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxwnpbkbqffdqtfzumuevgmdnsolcsds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913753.916221-155-206506335508128/AnsiballZ_setup.py'
Dec 05 05:49:14 compute-0 sudo[32817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:49:14 compute-0 python3.9[32819]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 05:49:14 compute-0 sudo[32817]: pam_unix(sudo:session): session closed for user root
Dec 05 05:49:14 compute-0 sudo[32973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iemftwgyrsvmqtkhxrdbmrmfgdxheswc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913754.6100583-171-221464895387387/AnsiballZ_file.py'
Dec 05 05:49:14 compute-0 sudo[32973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:49:14 compute-0 python3.9[32975]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:49:14 compute-0 sudo[32973]: pam_unix(sudo:session): session closed for user root
Dec 05 05:49:15 compute-0 sudo[33125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdihymgfdnhdtrxfkbokuityktnrhevr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913755.0892227-189-22316505383053/AnsiballZ_file.py'
Dec 05 05:49:15 compute-0 sudo[33125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:49:15 compute-0 python3.9[33127]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:49:15 compute-0 sudo[33125]: pam_unix(sudo:session): session closed for user root
Dec 05 05:49:15 compute-0 python3.9[33277]: ansible-ansible.builtin.service_facts Invoked
Dec 05 05:49:20 compute-0 python3.9[33530]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:49:20 compute-0 python3.9[33680]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 05:49:21 compute-0 python3.9[33834]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 05:49:22 compute-0 sudo[33990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxyesmzzwxbswtapvlnkfbgegutbzwni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913761.9701269-285-96578610361342/AnsiballZ_setup.py'
Dec 05 05:49:22 compute-0 sudo[33990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:49:22 compute-0 python3.9[33992]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 05 05:49:22 compute-0 sudo[33990]: pam_unix(sudo:session): session closed for user root
Dec 05 05:49:22 compute-0 sudo[34074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkoqflqqtlruhmosjlxvwhyvzwjelpwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913761.9701269-285-96578610361342/AnsiballZ_dnf.py'
Dec 05 05:49:22 compute-0 sudo[34074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:49:23 compute-0 python3.9[34076]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 05:50:46 compute-0 systemd[1]: Reloading.
Dec 05 05:50:46 compute-0 systemd-rc-local-generator[34276]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 05:50:46 compute-0 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Dec 05 05:50:46 compute-0 systemd[1]: Reloading.
Dec 05 05:50:46 compute-0 systemd-rc-local-generator[34317]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 05:50:46 compute-0 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Dec 05 05:50:46 compute-0 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Dec 05 05:50:46 compute-0 systemd[1]: Reloading.
Dec 05 05:50:46 compute-0 systemd-rc-local-generator[34358]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 05:50:46 compute-0 systemd[1]: Listening on LVM2 poll daemon socket.
Dec 05 05:50:46 compute-0 dbus-broker-launch[725]: Noticed file-system modification, trigger reload.
Dec 05 05:50:46 compute-0 dbus-broker-launch[725]: Noticed file-system modification, trigger reload.
Dec 05 05:50:46 compute-0 dbus-broker-launch[725]: Noticed file-system modification, trigger reload.
Dec 05 05:51:30 compute-0 kernel: SELinux:  Converting 2718 SID table entries...
Dec 05 05:51:30 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Dec 05 05:51:30 compute-0 kernel: SELinux:  policy capability open_perms=1
Dec 05 05:51:30 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Dec 05 05:51:30 compute-0 kernel: SELinux:  policy capability always_check_network=0
Dec 05 05:51:30 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 05 05:51:30 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 05 05:51:30 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 05 05:51:30 compute-0 dbus-broker-launch[732]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Dec 05 05:51:30 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 05 05:51:30 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 05 05:51:30 compute-0 systemd[1]: Reloading.
Dec 05 05:51:30 compute-0 systemd-rc-local-generator[34663]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 05:51:30 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 05 05:51:31 compute-0 sudo[34074]: pam_unix(sudo:session): session closed for user root
Dec 05 05:51:31 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 05 05:51:31 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 05 05:51:31 compute-0 systemd[1]: run-r1db142e1ac0b4e73b0c7441b2164edf9.service: Deactivated successfully.
Dec 05 05:51:31 compute-0 sudo[35576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wiloyhttlopbmubzkrnwmwnmajjjubtl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913891.1923347-309-193412144508017/AnsiballZ_command.py'
Dec 05 05:51:31 compute-0 sudo[35576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:51:31 compute-0 python3.9[35578]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:51:32 compute-0 sudo[35576]: pam_unix(sudo:session): session closed for user root
Dec 05 05:51:32 compute-0 sudo[35857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgyiigdvqsszjeuwsjwjmvdfcfyliazz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913892.3586683-325-212457241812917/AnsiballZ_selinux.py'
Dec 05 05:51:32 compute-0 sudo[35857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:51:33 compute-0 python3.9[35859]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Dec 05 05:51:33 compute-0 sudo[35857]: pam_unix(sudo:session): session closed for user root
Dec 05 05:51:33 compute-0 sudo[36009]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvysobdsupwveeasdwtbrqhbiesebbyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913893.30641-347-227253084554534/AnsiballZ_command.py'
Dec 05 05:51:33 compute-0 sudo[36009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:51:33 compute-0 python3.9[36011]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Dec 05 05:51:34 compute-0 sudo[36009]: pam_unix(sudo:session): session closed for user root
Dec 05 05:51:34 compute-0 sudo[36162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkswrokmusqoqnetzbggauplqrrwkwto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913894.483096-363-54944856985306/AnsiballZ_file.py'
Dec 05 05:51:34 compute-0 sudo[36162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:51:35 compute-0 python3.9[36164]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:51:35 compute-0 sudo[36162]: pam_unix(sudo:session): session closed for user root
Dec 05 05:51:35 compute-0 sudo[36314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odwdhbquhhvmmaagombtszqvzgqfwibe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913895.5103304-379-278350234712130/AnsiballZ_mount.py'
Dec 05 05:51:35 compute-0 sudo[36314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:51:35 compute-0 python3.9[36316]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Dec 05 05:51:36 compute-0 sudo[36314]: pam_unix(sudo:session): session closed for user root
Dec 05 05:51:36 compute-0 sudo[36466]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jislutaqahqnziqjznnmjlpqufttciaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913896.6172671-435-165572440285086/AnsiballZ_file.py'
Dec 05 05:51:36 compute-0 sudo[36466]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:51:36 compute-0 python3.9[36468]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:51:36 compute-0 sudo[36466]: pam_unix(sudo:session): session closed for user root
Dec 05 05:51:37 compute-0 sudo[36618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejlzgljytgxrckqjlwrtcprgysmwsavo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913897.1231844-451-67481930088437/AnsiballZ_stat.py'
Dec 05 05:51:37 compute-0 sudo[36618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:51:37 compute-0 python3.9[36620]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:51:37 compute-0 sudo[36618]: pam_unix(sudo:session): session closed for user root
Dec 05 05:51:37 compute-0 sudo[36741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vptptzxavnenlykqjhuczninhdjhufgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913897.1231844-451-67481930088437/AnsiballZ_copy.py'
Dec 05 05:51:37 compute-0 sudo[36741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:51:37 compute-0 python3.9[36743]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764913897.1231844-451-67481930088437/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=1e190ef277e596f2b1eff87bf5ee2d33d529f85c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:51:37 compute-0 sudo[36741]: pam_unix(sudo:session): session closed for user root
Dec 05 05:51:38 compute-0 sudo[36893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qugxwqikbjjzysmvnrowxccnboobmezh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913898.3373318-499-246662322106451/AnsiballZ_stat.py'
Dec 05 05:51:38 compute-0 sudo[36893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:51:38 compute-0 python3.9[36895]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 05:51:38 compute-0 sudo[36893]: pam_unix(sudo:session): session closed for user root
Dec 05 05:51:38 compute-0 sudo[37045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fyeqjlfvhdvclfwzvammvehotbprakbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913898.8206532-515-273942043597056/AnsiballZ_command.py'
Dec 05 05:51:38 compute-0 sudo[37045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:51:39 compute-0 python3.9[37047]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:51:39 compute-0 sudo[37045]: pam_unix(sudo:session): session closed for user root
Dec 05 05:51:39 compute-0 sudo[37198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvlprztqmwhjrimoyzygayqojnzrccvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913899.438287-531-125928511818316/AnsiballZ_file.py'
Dec 05 05:51:39 compute-0 sudo[37198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:51:39 compute-0 python3.9[37200]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:51:39 compute-0 sudo[37198]: pam_unix(sudo:session): session closed for user root
Dec 05 05:51:41 compute-0 sudo[37350]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmzubmrdolazizaturuuvcvkqfcixjoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913901.6106076-553-178064443738732/AnsiballZ_getent.py'
Dec 05 05:51:41 compute-0 sudo[37350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:51:42 compute-0 python3.9[37352]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Dec 05 05:51:42 compute-0 sudo[37350]: pam_unix(sudo:session): session closed for user root
Dec 05 05:51:42 compute-0 sudo[37503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxopovscltjvjukwamlgjhgnmweegdcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913902.6563594-569-200160953724079/AnsiballZ_group.py'
Dec 05 05:51:42 compute-0 sudo[37503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:51:45 compute-0 python3.9[37505]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 05 05:51:45 compute-0 groupadd[37506]: group added to /etc/group: name=qemu, GID=107
Dec 05 05:51:45 compute-0 rsyslogd[961]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 05 05:51:45 compute-0 groupadd[37506]: group added to /etc/gshadow: name=qemu
Dec 05 05:51:45 compute-0 groupadd[37506]: new group: name=qemu, GID=107
Dec 05 05:51:45 compute-0 sudo[37503]: pam_unix(sudo:session): session closed for user root
Dec 05 05:51:46 compute-0 sudo[37662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwfivjzziitlvckhczkewryawcjfuwjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913905.732039-585-105487983822956/AnsiballZ_user.py'
Dec 05 05:51:46 compute-0 sudo[37662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:51:46 compute-0 python3.9[37664]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 05 05:51:46 compute-0 useradd[37666]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Dec 05 05:51:46 compute-0 sudo[37662]: pam_unix(sudo:session): session closed for user root
Dec 05 05:51:46 compute-0 sudo[37822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgxlagfrgvfyxnvxsbnjsukdervnqhky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913906.3934047-601-16666513507872/AnsiballZ_getent.py'
Dec 05 05:51:46 compute-0 sudo[37822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:51:46 compute-0 python3.9[37824]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Dec 05 05:51:46 compute-0 sudo[37822]: pam_unix(sudo:session): session closed for user root
Dec 05 05:51:47 compute-0 sudo[37975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqwngegxdydbeqpqtehxrnnlpjyfjfcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913906.8671522-617-219280768701278/AnsiballZ_group.py'
Dec 05 05:51:47 compute-0 sudo[37975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:51:47 compute-0 python3.9[37977]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 05 05:51:47 compute-0 groupadd[37978]: group added to /etc/group: name=hugetlbfs, GID=42477
Dec 05 05:51:47 compute-0 groupadd[37978]: group added to /etc/gshadow: name=hugetlbfs
Dec 05 05:51:47 compute-0 groupadd[37978]: new group: name=hugetlbfs, GID=42477
Dec 05 05:51:47 compute-0 sudo[37975]: pam_unix(sudo:session): session closed for user root
Dec 05 05:51:47 compute-0 sudo[38133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znqlylxxheaiqbbamxkjnkfuausogydu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913907.4333-635-145929804442926/AnsiballZ_file.py'
Dec 05 05:51:47 compute-0 sudo[38133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:51:47 compute-0 python3.9[38135]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Dec 05 05:51:47 compute-0 sudo[38133]: pam_unix(sudo:session): session closed for user root
Dec 05 05:51:48 compute-0 sudo[38285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqydwxxhcvawtrcsekobglmqzpkisevd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913908.0501842-657-131281547391589/AnsiballZ_dnf.py'
Dec 05 05:51:48 compute-0 sudo[38285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:51:48 compute-0 python3.9[38287]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 05:51:49 compute-0 sudo[38285]: pam_unix(sudo:session): session closed for user root
Dec 05 05:51:49 compute-0 sudo[38438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chlxnwudzlylifxpzwkpawgwzvtindfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913909.7688198-673-207658175025616/AnsiballZ_file.py'
Dec 05 05:51:49 compute-0 sudo[38438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:51:50 compute-0 python3.9[38440]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:51:50 compute-0 sudo[38438]: pam_unix(sudo:session): session closed for user root
Dec 05 05:51:50 compute-0 sudo[38590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzbvuwjlasljoszlpkosfwcdusvudbix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913910.2498856-689-38832845894450/AnsiballZ_stat.py'
Dec 05 05:51:50 compute-0 sudo[38590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:51:50 compute-0 python3.9[38592]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:51:50 compute-0 sudo[38590]: pam_unix(sudo:session): session closed for user root
Dec 05 05:51:50 compute-0 sudo[38713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbcxnysygilsolfcdknaffspcaiyxgog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913910.2498856-689-38832845894450/AnsiballZ_copy.py'
Dec 05 05:51:50 compute-0 sudo[38713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:51:50 compute-0 python3.9[38715]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764913910.2498856-689-38832845894450/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:51:50 compute-0 sudo[38713]: pam_unix(sudo:session): session closed for user root
Dec 05 05:51:51 compute-0 sudo[38865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psslklvuaanygvughsptsbqhlidbdplb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913911.0649607-719-71980277571864/AnsiballZ_systemd.py'
Dec 05 05:51:51 compute-0 sudo[38865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:51:51 compute-0 python3.9[38867]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 05:51:51 compute-0 systemd[1]: Starting Load Kernel Modules...
Dec 05 05:51:51 compute-0 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Dec 05 05:51:51 compute-0 systemd-modules-load[38871]: Inserted module 'br_netfilter'
Dec 05 05:51:51 compute-0 kernel: Bridge firewalling registered
Dec 05 05:51:51 compute-0 systemd[1]: Finished Load Kernel Modules.
Dec 05 05:51:51 compute-0 sudo[38865]: pam_unix(sudo:session): session closed for user root
Dec 05 05:51:52 compute-0 sudo[39024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvgucvhbfsrcxylxxmdbutqyhmdpkovf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913911.9179275-735-104500208257428/AnsiballZ_stat.py'
Dec 05 05:51:52 compute-0 sudo[39024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:51:52 compute-0 python3.9[39026]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:51:52 compute-0 sudo[39024]: pam_unix(sudo:session): session closed for user root
Dec 05 05:51:52 compute-0 sudo[39147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ioztjejmjgnqzovfaeyifhjzwfbbqsyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913911.9179275-735-104500208257428/AnsiballZ_copy.py'
Dec 05 05:51:52 compute-0 sudo[39147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:51:52 compute-0 python3.9[39149]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764913911.9179275-735-104500208257428/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:51:52 compute-0 sudo[39147]: pam_unix(sudo:session): session closed for user root
Dec 05 05:51:53 compute-0 sudo[39299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yuxjvqgzjqitbrpgxymbnijvurenahmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913912.899676-771-12258092539861/AnsiballZ_dnf.py'
Dec 05 05:51:53 compute-0 sudo[39299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:51:53 compute-0 python3.9[39301]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 05:51:57 compute-0 dbus-broker-launch[725]: Noticed file-system modification, trigger reload.
Dec 05 05:51:58 compute-0 dbus-broker-launch[725]: Noticed file-system modification, trigger reload.
Dec 05 05:51:58 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 05 05:51:58 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 05 05:51:58 compute-0 systemd[1]: Reloading.
Dec 05 05:51:58 compute-0 systemd-rc-local-generator[39359]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 05:51:58 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 05 05:51:58 compute-0 sudo[39299]: pam_unix(sudo:session): session closed for user root
Dec 05 05:52:00 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 05 05:52:00 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 05 05:52:00 compute-0 systemd[1]: man-db-cache-update.service: Consumed 2.975s CPU time.
Dec 05 05:52:00 compute-0 systemd[1]: run-r5b015d9bd2b945c39e1a7173b9c1982b.service: Deactivated successfully.
Dec 05 05:52:02 compute-0 python3.9[43008]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 05:52:02 compute-0 python3.9[43160]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Dec 05 05:52:03 compute-0 python3.9[43310]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 05:52:03 compute-0 sudo[43460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgozrasvcoreqqahidohrycvelwvvqvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913923.3922033-849-103449179230857/AnsiballZ_command.py'
Dec 05 05:52:03 compute-0 sudo[43460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:52:03 compute-0 python3.9[43462]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:52:03 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 05 05:52:04 compute-0 systemd[1]: Starting Authorization Manager...
Dec 05 05:52:04 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Dec 05 05:52:04 compute-0 polkitd[43679]: Started polkitd version 0.117
Dec 05 05:52:04 compute-0 polkitd[43679]: Loading rules from directory /etc/polkit-1/rules.d
Dec 05 05:52:04 compute-0 polkitd[43679]: Loading rules from directory /usr/share/polkit-1/rules.d
Dec 05 05:52:04 compute-0 polkitd[43679]: Finished loading, compiling and executing 2 rules
Dec 05 05:52:04 compute-0 polkitd[43679]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Dec 05 05:52:04 compute-0 systemd[1]: Started Authorization Manager.
Dec 05 05:52:04 compute-0 sudo[43460]: pam_unix(sudo:session): session closed for user root
Dec 05 05:52:04 compute-0 sudo[43843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxmpbzneuwjycsjzglrwqmlohpdncjrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913924.403725-867-142062008449597/AnsiballZ_systemd.py'
Dec 05 05:52:04 compute-0 sudo[43843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:52:04 compute-0 python3.9[43845]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 05:52:04 compute-0 systemd[1]: Stopping Dynamic System Tuning Daemon...
Dec 05 05:52:04 compute-0 systemd[1]: tuned.service: Deactivated successfully.
Dec 05 05:52:04 compute-0 systemd[1]: Stopped Dynamic System Tuning Daemon.
Dec 05 05:52:04 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 05 05:52:05 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Dec 05 05:52:05 compute-0 sudo[43843]: pam_unix(sudo:session): session closed for user root
Dec 05 05:52:05 compute-0 python3.9[44007]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Dec 05 05:52:07 compute-0 sudo[44157]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yyoopigwaljjtprxkmvudvakerezyzoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913927.219511-981-199136317002136/AnsiballZ_systemd.py'
Dec 05 05:52:07 compute-0 sudo[44157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:52:07 compute-0 python3.9[44159]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 05:52:07 compute-0 systemd[1]: Reloading.
Dec 05 05:52:07 compute-0 systemd-rc-local-generator[44185]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 05:52:07 compute-0 sudo[44157]: pam_unix(sudo:session): session closed for user root
Dec 05 05:52:08 compute-0 sudo[44346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twufisgbbhxowycyvmstiwyikqezltke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913927.9053981-981-252954958878557/AnsiballZ_systemd.py'
Dec 05 05:52:08 compute-0 sudo[44346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:52:08 compute-0 python3.9[44348]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 05:52:08 compute-0 systemd[1]: Reloading.
Dec 05 05:52:08 compute-0 systemd-rc-local-generator[44370]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 05:52:08 compute-0 sudo[44346]: pam_unix(sudo:session): session closed for user root
Dec 05 05:52:08 compute-0 sudo[44535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onsmfoddcdqhoprjpkcltycxwipwshky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913928.7985451-1013-9695828052570/AnsiballZ_command.py'
Dec 05 05:52:08 compute-0 sudo[44535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:52:09 compute-0 python3.9[44537]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:52:09 compute-0 sudo[44535]: pam_unix(sudo:session): session closed for user root
Dec 05 05:52:09 compute-0 sudo[44688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmijmfdizhbhfbjxryvxqwabgsbswflo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913929.337271-1029-90193586028827/AnsiballZ_command.py'
Dec 05 05:52:09 compute-0 sudo[44688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:52:09 compute-0 python3.9[44690]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:52:09 compute-0 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Dec 05 05:52:09 compute-0 sudo[44688]: pam_unix(sudo:session): session closed for user root
Dec 05 05:52:10 compute-0 sudo[44841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxbpcwxkzhhtnlpmfxflrrsxrpbutafl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913929.8665345-1045-45058686172521/AnsiballZ_command.py'
Dec 05 05:52:10 compute-0 sudo[44841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:52:10 compute-0 python3.9[44843]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:52:11 compute-0 sudo[44841]: pam_unix(sudo:session): session closed for user root
Dec 05 05:52:11 compute-0 sudo[45003]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbjcngsqnyraziptkdfabqmknumkjrsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913931.5014074-1061-261103649416702/AnsiballZ_command.py'
Dec 05 05:52:11 compute-0 sudo[45003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:52:11 compute-0 python3.9[45005]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:52:11 compute-0 sudo[45003]: pam_unix(sudo:session): session closed for user root
Dec 05 05:52:13 compute-0 sudo[45156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usbzdiorgpzptyomtxvoaqecamwozpkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913933.0262902-1077-54289542790173/AnsiballZ_systemd.py'
Dec 05 05:52:13 compute-0 sudo[45156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:52:13 compute-0 python3.9[45158]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 05:52:13 compute-0 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 05 05:52:13 compute-0 systemd[1]: Stopped Apply Kernel Variables.
Dec 05 05:52:13 compute-0 systemd[1]: Stopping Apply Kernel Variables...
Dec 05 05:52:13 compute-0 systemd[1]: Starting Apply Kernel Variables...
Dec 05 05:52:13 compute-0 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec 05 05:52:13 compute-0 systemd[1]: Finished Apply Kernel Variables.
Dec 05 05:52:13 compute-0 sudo[45156]: pam_unix(sudo:session): session closed for user root
Dec 05 05:52:13 compute-0 sshd-session[31611]: Connection closed by 192.168.122.30 port 32908
Dec 05 05:52:13 compute-0 sshd-session[31608]: pam_unix(sshd:session): session closed for user zuul
Dec 05 05:52:13 compute-0 systemd[1]: session-9.scope: Deactivated successfully.
Dec 05 05:52:13 compute-0 systemd[1]: session-9.scope: Consumed 1min 37.808s CPU time.
Dec 05 05:52:13 compute-0 systemd-logind[745]: Session 9 logged out. Waiting for processes to exit.
Dec 05 05:52:13 compute-0 systemd-logind[745]: Removed session 9.
Dec 05 05:52:18 compute-0 sshd-session[45188]: Accepted publickey for zuul from 192.168.122.30 port 58556 ssh2: ECDSA SHA256:qs1gnk6DlsWBMwOXH28ouwL9ltr8rmS6SqK96Fn6Lw8
Dec 05 05:52:18 compute-0 systemd-logind[745]: New session 10 of user zuul.
Dec 05 05:52:18 compute-0 systemd[1]: Started Session 10 of User zuul.
Dec 05 05:52:18 compute-0 sshd-session[45188]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 05:52:19 compute-0 python3.9[45341]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 05:52:20 compute-0 python3.9[45495]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 05:52:21 compute-0 sudo[45649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orhqldyzqliywsoefrtrvwsekanansuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913941.1508837-80-274877381106999/AnsiballZ_command.py'
Dec 05 05:52:21 compute-0 sudo[45649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:52:21 compute-0 python3.9[45651]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:52:21 compute-0 sudo[45649]: pam_unix(sudo:session): session closed for user root
Dec 05 05:52:22 compute-0 python3.9[45802]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 05:52:22 compute-0 sudo[45956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcmtovblpecvtzxwomudidadfwcmfitt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913942.6929224-120-8803889482020/AnsiballZ_setup.py'
Dec 05 05:52:22 compute-0 sudo[45956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:52:23 compute-0 python3.9[45958]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 05 05:52:23 compute-0 sudo[45956]: pam_unix(sudo:session): session closed for user root
Dec 05 05:52:23 compute-0 sudo[46040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyhbxufffhnkpwblilxyxbblntptavoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913942.6929224-120-8803889482020/AnsiballZ_dnf.py'
Dec 05 05:52:23 compute-0 sudo[46040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:52:23 compute-0 python3.9[46042]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 05:52:24 compute-0 sudo[46040]: pam_unix(sudo:session): session closed for user root
Dec 05 05:52:25 compute-0 sudo[46193]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcjzjqiduyxhgytftqomexarbcqpcpkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913944.9821675-144-40937302923748/AnsiballZ_setup.py'
Dec 05 05:52:25 compute-0 sudo[46193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:52:25 compute-0 python3.9[46195]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 05 05:52:25 compute-0 sudo[46193]: pam_unix(sudo:session): session closed for user root
Dec 05 05:52:26 compute-0 sudo[46364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ranzsqeqeonpqipzqulivgraioqptvyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913945.7525103-166-15726944567859/AnsiballZ_file.py'
Dec 05 05:52:26 compute-0 sudo[46364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:52:26 compute-0 python3.9[46366]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:52:26 compute-0 sudo[46364]: pam_unix(sudo:session): session closed for user root
Dec 05 05:52:26 compute-0 sudo[46516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbbrfiqvxdoelefcbdbpkgccjsurrjfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913946.3490283-182-191457905786833/AnsiballZ_command.py'
Dec 05 05:52:26 compute-0 sudo[46516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:52:26 compute-0 python3.9[46518]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:52:26 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat2132649333-merged.mount: Deactivated successfully.
Dec 05 05:52:26 compute-0 systemd[1]: var-lib-containers-storage-overlay-metacopy\x2dcheck2312302999-merged.mount: Deactivated successfully.
Dec 05 05:52:26 compute-0 podman[46519]: 2025-12-05 05:52:26.702350821 +0000 UTC m=+0.030532843 system refresh
Dec 05 05:52:26 compute-0 sudo[46516]: pam_unix(sudo:session): session closed for user root
Dec 05 05:52:27 compute-0 sudo[46676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idsagvkhkbukzlamziajcdyguuwnqath ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913946.8877418-198-34979899164515/AnsiballZ_stat.py'
Dec 05 05:52:27 compute-0 sudo[46676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:52:27 compute-0 python3.9[46678]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:52:27 compute-0 sudo[46676]: pam_unix(sudo:session): session closed for user root
Dec 05 05:52:27 compute-0 sudo[46799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbcrpjzefksppuzcvatzeaqcrnpadblw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913946.8877418-198-34979899164515/AnsiballZ_copy.py'
Dec 05 05:52:27 compute-0 sudo[46799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:52:27 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 05:52:27 compute-0 python3.9[46801]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764913946.8877418-198-34979899164515/.source.json follow=False _original_basename=podman_network_config.j2 checksum=18011d87a5be3e94223bbcb8c14d8c80e072856f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:52:27 compute-0 sudo[46799]: pam_unix(sudo:session): session closed for user root
Dec 05 05:52:28 compute-0 sudo[46951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xynifehbtvdvjvyesqnbkinaawvzozgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913947.9438999-228-145619729439049/AnsiballZ_stat.py'
Dec 05 05:52:28 compute-0 sudo[46951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:52:28 compute-0 python3.9[46953]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:52:28 compute-0 sudo[46951]: pam_unix(sudo:session): session closed for user root
Dec 05 05:52:28 compute-0 sudo[47074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ueaymnslcbuworncnhfebocngtagdemk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913947.9438999-228-145619729439049/AnsiballZ_copy.py'
Dec 05 05:52:28 compute-0 sudo[47074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:52:28 compute-0 python3.9[47076]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764913947.9438999-228-145619729439049/.source.conf follow=False _original_basename=registries.conf.j2 checksum=804a0d01b832e60d20f779a331306df708c87b02 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:52:28 compute-0 sudo[47074]: pam_unix(sudo:session): session closed for user root
Dec 05 05:52:29 compute-0 sudo[47226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acyujnaacmehfuklkytlvoaestxwdsdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913948.8267248-260-128869849083361/AnsiballZ_ini_file.py'
Dec 05 05:52:29 compute-0 sudo[47226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:52:29 compute-0 python3.9[47228]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:52:29 compute-0 sudo[47226]: pam_unix(sudo:session): session closed for user root
Dec 05 05:52:29 compute-0 sudo[47378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axvuqwflxzciemfwwcxjhdqiehqotmgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913949.3781605-260-209088365086961/AnsiballZ_ini_file.py'
Dec 05 05:52:29 compute-0 sudo[47378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:52:29 compute-0 python3.9[47380]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:52:29 compute-0 sudo[47378]: pam_unix(sudo:session): session closed for user root
Dec 05 05:52:29 compute-0 sudo[47530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ieycapybcthvrvxuyublnpwyrgjdicwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913949.7891085-260-144096230018780/AnsiballZ_ini_file.py'
Dec 05 05:52:29 compute-0 sudo[47530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:52:30 compute-0 python3.9[47532]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:52:30 compute-0 sudo[47530]: pam_unix(sudo:session): session closed for user root
Dec 05 05:52:30 compute-0 sudo[47682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pllikxuzqifvdlwyinxsssvafskfvwng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913950.197234-260-146540892103108/AnsiballZ_ini_file.py'
Dec 05 05:52:30 compute-0 sudo[47682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:52:30 compute-0 python3.9[47684]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:52:30 compute-0 sudo[47682]: pam_unix(sudo:session): session closed for user root
Dec 05 05:52:31 compute-0 python3.9[47834]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 05:52:31 compute-0 sudo[47986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbigvnmwmndmzfifacgqhclujuncdqxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913951.3837469-340-149304053687139/AnsiballZ_dnf.py'
Dec 05 05:52:31 compute-0 sudo[47986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:52:31 compute-0 python3.9[47988]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 05 05:52:32 compute-0 sudo[47986]: pam_unix(sudo:session): session closed for user root
Dec 05 05:52:32 compute-0 sudo[48139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcahwqoxtsvlalmitdqgezgeptdzdoad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913952.8138063-356-23986096921120/AnsiballZ_dnf.py'
Dec 05 05:52:32 compute-0 sudo[48139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:52:33 compute-0 python3.9[48141]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 05 05:52:36 compute-0 sudo[48139]: pam_unix(sudo:session): session closed for user root
Dec 05 05:52:37 compute-0 sudo[48301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjffpzgeauxwyruttsvzrnhdkeyoqhtz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913956.999737-376-270322343252130/AnsiballZ_dnf.py'
Dec 05 05:52:37 compute-0 sudo[48301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:52:37 compute-0 python3.9[48303]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 05 05:52:38 compute-0 sudo[48301]: pam_unix(sudo:session): session closed for user root
Dec 05 05:52:38 compute-0 sudo[48454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oypjasguelaxwumtodilwrlhkkxpgppb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913958.5098267-394-161279712632613/AnsiballZ_dnf.py'
Dec 05 05:52:38 compute-0 sudo[48454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:52:38 compute-0 python3.9[48456]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 05 05:52:39 compute-0 sudo[48454]: pam_unix(sudo:session): session closed for user root
Dec 05 05:52:40 compute-0 sudo[48607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqwjcfldmjfjvwyyiwgicqhhloxhzwog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913960.1164966-416-258794696434836/AnsiballZ_dnf.py'
Dec 05 05:52:40 compute-0 sudo[48607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:52:40 compute-0 python3.9[48609]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 05 05:52:42 compute-0 sudo[48607]: pam_unix(sudo:session): session closed for user root
Dec 05 05:52:43 compute-0 sudo[48763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdefcokqilhwjvgwjroapwxrosjqrrkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913963.2904987-432-140881791144869/AnsiballZ_dnf.py'
Dec 05 05:52:43 compute-0 sudo[48763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:52:43 compute-0 python3.9[48765]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 05 05:52:50 compute-0 sudo[48763]: pam_unix(sudo:session): session closed for user root
Dec 05 05:52:53 compute-0 sudo[48933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvhyvtqmjdtjjvexshrygglvykfgbnfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913973.6559775-450-68393925905910/AnsiballZ_dnf.py'
Dec 05 05:52:53 compute-0 sudo[48933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:52:54 compute-0 python3.9[48935]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 05 05:52:54 compute-0 sudo[48933]: pam_unix(sudo:session): session closed for user root
Dec 05 05:52:55 compute-0 sudo[49086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pewkpsfpmeznwsctfwcacamssbpnupys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913975.1854336-468-9788236099445/AnsiballZ_dnf.py'
Dec 05 05:52:55 compute-0 sudo[49086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:52:55 compute-0 python3.9[49088]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 05 05:53:14 compute-0 sudo[49086]: pam_unix(sudo:session): session closed for user root
Dec 05 05:53:16 compute-0 sudo[49423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfoaggwfplzraseaooecehinijhmiosb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913995.8732827-486-164545568307277/AnsiballZ_dnf.py'
Dec 05 05:53:16 compute-0 sudo[49423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:53:16 compute-0 python3.9[49425]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 05 05:53:17 compute-0 sudo[49423]: pam_unix(sudo:session): session closed for user root
Dec 05 05:53:17 compute-0 sudo[49579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrjfjfbqswuccnnvqzjvxgysjaxcxknw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913997.527882-508-238839728432492/AnsiballZ_file.py'
Dec 05 05:53:17 compute-0 sudo[49579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:53:17 compute-0 python3.9[49581]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:53:17 compute-0 sudo[49579]: pam_unix(sudo:session): session closed for user root
Dec 05 05:53:18 compute-0 sudo[49754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ummgkikvrkoiuiqzvzhfmqposldtjiyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913998.0183759-524-242718235960391/AnsiballZ_stat.py'
Dec 05 05:53:18 compute-0 sudo[49754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:53:18 compute-0 python3.9[49756]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:53:18 compute-0 sudo[49754]: pam_unix(sudo:session): session closed for user root
Dec 05 05:53:18 compute-0 sudo[49877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfczpsibdpzhdfemewpskvgmefevapom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913998.0183759-524-242718235960391/AnsiballZ_copy.py'
Dec 05 05:53:18 compute-0 sudo[49877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:53:18 compute-0 python3.9[49879]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1764913998.0183759-524-242718235960391/.source.json _original_basename=.vzcu9phd follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:53:18 compute-0 sudo[49877]: pam_unix(sudo:session): session closed for user root
Dec 05 05:53:19 compute-0 sudo[50029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekvgkyjuhxngjtrbeeraoiidsihlmdun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764913999.0823255-560-70836474411220/AnsiballZ_podman_image.py'
Dec 05 05:53:19 compute-0 sudo[50029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:53:19 compute-0 python3.9[50031]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 05 05:53:19 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 05:53:31 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat156865084-merged.mount: Deactivated successfully.
Dec 05 05:53:32 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat156865084-lower\x2dmapped.mount: Deactivated successfully.
Dec 05 05:53:36 compute-0 podman[50041]: 2025-12-05 05:53:36.136420028 +0000 UTC m=+16.484143067 image pull 8a34d4ae7a6c24e04826a1710ee4298adbc68547aa0db91d73c9de73375782b7 quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current
Dec 05 05:53:36 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 05:53:36 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 05:53:36 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 05:53:36 compute-0 sudo[50029]: pam_unix(sudo:session): session closed for user root
Dec 05 05:53:36 compute-0 sudo[50309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngqzmaokkxirtytswigfjtbcndbwwyhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914016.5384548-582-71158196043956/AnsiballZ_podman_image.py'
Dec 05 05:53:36 compute-0 sudo[50309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:53:36 compute-0 python3.9[50311]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 05 05:53:50 compute-0 podman[50321]: 2025-12-05 05:53:50.515105087 +0000 UTC m=+13.605367086 image pull 773fbabd60bc1c8e2ad203301a187a593937fa42bcc751aba0971bd96baac0cb quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current
Dec 05 05:53:50 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 05:53:50 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 05:53:50 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 05:53:50 compute-0 sudo[50309]: pam_unix(sudo:session): session closed for user root
Dec 05 05:53:51 compute-0 sudo[50588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdnyvbivduyisdhzjgvbudyfttnnsuhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914030.860778-602-46372776262368/AnsiballZ_podman_image.py'
Dec 05 05:53:51 compute-0 sudo[50588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:53:51 compute-0 python3.9[50590]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 05 05:53:52 compute-0 podman[50600]: 2025-12-05 05:53:52.584030457 +0000 UTC m=+1.350681223 image pull e33420805289bc187306032371d5d431ac611775aa0ba0a9b90183e961a97dc0 quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current
Dec 05 05:53:52 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 05:53:52 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 05:53:52 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 05:53:52 compute-0 sudo[50588]: pam_unix(sudo:session): session closed for user root
Dec 05 05:53:53 compute-0 sudo[50813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqtnyhjnxfkpgogxehqqpwnyytpqqjqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914032.937835-620-249180532934400/AnsiballZ_podman_image.py'
Dec 05 05:53:53 compute-0 sudo[50813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:53:53 compute-0 python3.9[50815]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.rdoproject.org/podified-master-centos10/openstack-nova-compute:current tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 05 05:54:31 compute-0 podman[50825]: 2025-12-05 05:54:31.815605367 +0000 UTC m=+38.502917283 image pull b8877984ba66cc23b3665a3bbc064555c69577db52335d87bee5ea0e0b4830bc quay.rdoproject.org/podified-master-centos10/openstack-nova-compute:current
Dec 05 05:54:31 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 05:54:31 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 05:54:31 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 05:54:31 compute-0 sudo[50813]: pam_unix(sudo:session): session closed for user root
Dec 05 05:54:32 compute-0 sudo[51057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkfqeexwfqfnwzhbvsqgzzxedknjhqwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914072.185846-642-232294531658399/AnsiballZ_podman_image.py'
Dec 05 05:54:32 compute-0 sudo[51057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:54:32 compute-0 python3.9[51059]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 05 05:54:38 compute-0 podman[51069]: 2025-12-05 05:54:38.060778428 +0000 UTC m=+5.508857867 image pull 7e75ee903cbf5a3a609785d20f8960c2e0ae07869b58c22d9c011c7531639f47 quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current
Dec 05 05:54:38 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 05:54:38 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 05:54:38 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 05:54:38 compute-0 sudo[51057]: pam_unix(sudo:session): session closed for user root
Dec 05 05:54:38 compute-0 sudo[51302]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tytwsabcalgwqlkbmotwzhwcnsmhanmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914078.290635-642-111197351188292/AnsiballZ_podman_image.py'
Dec 05 05:54:38 compute-0 sudo[51302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:54:38 compute-0 python3.9[51304]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 05 05:54:40 compute-0 podman[51314]: 2025-12-05 05:54:40.003586434 +0000 UTC m=+1.338886557 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c
Dec 05 05:54:40 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 05:54:40 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 05:54:40 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 05:54:40 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 05:54:40 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 05:54:40 compute-0 sudo[51302]: pam_unix(sudo:session): session closed for user root
Dec 05 05:54:40 compute-0 sshd-session[45191]: Connection closed by 192.168.122.30 port 58556
Dec 05 05:54:40 compute-0 sshd-session[45188]: pam_unix(sshd:session): session closed for user zuul
Dec 05 05:54:40 compute-0 systemd[1]: session-10.scope: Deactivated successfully.
Dec 05 05:54:40 compute-0 systemd[1]: session-10.scope: Consumed 1min 38.809s CPU time.
Dec 05 05:54:40 compute-0 systemd-logind[745]: Session 10 logged out. Waiting for processes to exit.
Dec 05 05:54:40 compute-0 systemd-logind[745]: Removed session 10.
Dec 05 05:54:46 compute-0 sshd-session[51437]: Accepted publickey for zuul from 192.168.122.30 port 51090 ssh2: ECDSA SHA256:qs1gnk6DlsWBMwOXH28ouwL9ltr8rmS6SqK96Fn6Lw8
Dec 05 05:54:46 compute-0 systemd-logind[745]: New session 11 of user zuul.
Dec 05 05:54:46 compute-0 systemd[1]: Started Session 11 of User zuul.
Dec 05 05:54:46 compute-0 sshd-session[51437]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 05:54:46 compute-0 python3.9[51590]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 05:54:47 compute-0 sudo[51744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpbyoicgzzcfrnayskklqwqpjeoivbtx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914087.306123-52-94186854828058/AnsiballZ_getent.py'
Dec 05 05:54:47 compute-0 sudo[51744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:54:47 compute-0 python3.9[51746]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Dec 05 05:54:47 compute-0 sudo[51744]: pam_unix(sudo:session): session closed for user root
Dec 05 05:54:48 compute-0 sudo[51897]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vysydyasfhjjoiwrckgcyzvldlvbsbbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914087.8483882-68-257888353712407/AnsiballZ_group.py'
Dec 05 05:54:48 compute-0 sudo[51897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:54:48 compute-0 python3.9[51899]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 05 05:54:48 compute-0 groupadd[51900]: group added to /etc/group: name=openvswitch, GID=42476
Dec 05 05:54:48 compute-0 groupadd[51900]: group added to /etc/gshadow: name=openvswitch
Dec 05 05:54:48 compute-0 groupadd[51900]: new group: name=openvswitch, GID=42476
Dec 05 05:54:48 compute-0 sudo[51897]: pam_unix(sudo:session): session closed for user root
Dec 05 05:54:48 compute-0 sudo[52055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngodiaxtzilrxofeqvexwuecziutoohj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914088.4442165-84-166235503293585/AnsiballZ_user.py'
Dec 05 05:54:48 compute-0 sudo[52055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:54:48 compute-0 python3.9[52057]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 05 05:54:48 compute-0 useradd[52059]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Dec 05 05:54:48 compute-0 useradd[52059]: add 'openvswitch' to group 'hugetlbfs'
Dec 05 05:54:48 compute-0 useradd[52059]: add 'openvswitch' to shadow group 'hugetlbfs'
Dec 05 05:54:48 compute-0 sudo[52055]: pam_unix(sudo:session): session closed for user root
Dec 05 05:54:49 compute-0 sudo[52215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gabcnibekhwgdmtostljkjapjzpiebve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914089.162351-104-197295316198018/AnsiballZ_setup.py'
Dec 05 05:54:49 compute-0 sudo[52215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:54:49 compute-0 python3.9[52217]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 05 05:54:49 compute-0 sudo[52215]: pam_unix(sudo:session): session closed for user root
Dec 05 05:54:50 compute-0 sudo[52299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blkbsnpeodcqecrnufhgecjsimabqmso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914089.162351-104-197295316198018/AnsiballZ_dnf.py'
Dec 05 05:54:50 compute-0 sudo[52299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:54:50 compute-0 python3.9[52301]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 05 05:54:54 compute-0 sudo[52299]: pam_unix(sudo:session): session closed for user root
Dec 05 05:54:54 compute-0 sudo[52461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iamggmmqyeicyswvtvywgmbdjzcpqybd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914094.259677-132-24454218360869/AnsiballZ_dnf.py'
Dec 05 05:54:54 compute-0 sudo[52461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:54:54 compute-0 python3.9[52463]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 05:55:02 compute-0 kernel: SELinux:  Converting 2731 SID table entries...
Dec 05 05:55:02 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Dec 05 05:55:02 compute-0 kernel: SELinux:  policy capability open_perms=1
Dec 05 05:55:02 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Dec 05 05:55:02 compute-0 kernel: SELinux:  policy capability always_check_network=0
Dec 05 05:55:02 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 05 05:55:02 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 05 05:55:02 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 05 05:55:02 compute-0 groupadd[52486]: group added to /etc/group: name=unbound, GID=993
Dec 05 05:55:02 compute-0 groupadd[52486]: group added to /etc/gshadow: name=unbound
Dec 05 05:55:02 compute-0 groupadd[52486]: new group: name=unbound, GID=993
Dec 05 05:55:02 compute-0 useradd[52493]: new user: name=unbound, UID=993, GID=993, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Dec 05 05:55:02 compute-0 dbus-broker-launch[732]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Dec 05 05:55:02 compute-0 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Dec 05 05:55:03 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 05 05:55:03 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 05 05:55:03 compute-0 systemd[1]: Reloading.
Dec 05 05:55:03 compute-0 systemd-rc-local-generator[52985]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 05:55:03 compute-0 systemd-sysv-generator[52988]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 05:55:03 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 05 05:55:04 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 05 05:55:04 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 05 05:55:04 compute-0 systemd[1]: run-r6c246628babb40f68cea74dc8249fbb2.service: Deactivated successfully.
Dec 05 05:55:04 compute-0 sudo[52461]: pam_unix(sudo:session): session closed for user root
Dec 05 05:55:05 compute-0 sudo[53559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwvsaywvlpquamkgjahdijvrfhypgxsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914104.9916785-148-261747890207170/AnsiballZ_systemd.py'
Dec 05 05:55:05 compute-0 sudo[53559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:55:05 compute-0 python3.9[53561]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 05 05:55:05 compute-0 systemd[1]: Reloading.
Dec 05 05:55:05 compute-0 systemd-rc-local-generator[53584]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 05:55:05 compute-0 systemd-sysv-generator[53587]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 05:55:05 compute-0 systemd[1]: Starting Open vSwitch Database Unit...
Dec 05 05:55:05 compute-0 chown[53603]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Dec 05 05:55:05 compute-0 ovs-ctl[53608]: /etc/openvswitch/conf.db does not exist ... (warning).
Dec 05 05:55:05 compute-0 ovs-ctl[53608]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Dec 05 05:55:05 compute-0 ovs-ctl[53608]: Starting ovsdb-server [  OK  ]
Dec 05 05:55:05 compute-0 ovs-vsctl[53657]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Dec 05 05:55:06 compute-0 ovs-vsctl[53677]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"89d40815-76f5-4f1d-9077-84d831b7d6c4\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Dec 05 05:55:06 compute-0 ovs-ctl[53608]: Configuring Open vSwitch system IDs [  OK  ]
Dec 05 05:55:06 compute-0 ovs-ctl[53608]: Enabling remote OVSDB managers [  OK  ]
Dec 05 05:55:06 compute-0 systemd[1]: Started Open vSwitch Database Unit.
Dec 05 05:55:06 compute-0 ovs-vsctl[53683]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Dec 05 05:55:06 compute-0 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Dec 05 05:55:06 compute-0 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Dec 05 05:55:06 compute-0 systemd[1]: Starting Open vSwitch Forwarding Unit...
Dec 05 05:55:06 compute-0 kernel: openvswitch: Open vSwitch switching datapath
Dec 05 05:55:06 compute-0 ovs-ctl[53728]: Inserting openvswitch module [  OK  ]
Dec 05 05:55:06 compute-0 ovs-ctl[53697]: Starting ovs-vswitchd [  OK  ]
Dec 05 05:55:06 compute-0 ovs-ctl[53697]: Enabling remote OVSDB managers [  OK  ]
Dec 05 05:55:06 compute-0 systemd[1]: Started Open vSwitch Forwarding Unit.
Dec 05 05:55:06 compute-0 ovs-vsctl[53746]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Dec 05 05:55:06 compute-0 systemd[1]: Starting Open vSwitch...
Dec 05 05:55:06 compute-0 systemd[1]: Finished Open vSwitch.
Dec 05 05:55:06 compute-0 sudo[53559]: pam_unix(sudo:session): session closed for user root
Dec 05 05:55:06 compute-0 python3.9[53897]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 05:55:07 compute-0 sudo[54047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwdmsdtgrclgaqijfdmulssoddgxbbnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914106.9942532-184-199528064640134/AnsiballZ_sefcontext.py'
Dec 05 05:55:07 compute-0 sudo[54047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:55:07 compute-0 python3.9[54049]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Dec 05 05:55:08 compute-0 kernel: SELinux:  Converting 2745 SID table entries...
Dec 05 05:55:08 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Dec 05 05:55:08 compute-0 kernel: SELinux:  policy capability open_perms=1
Dec 05 05:55:08 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Dec 05 05:55:08 compute-0 kernel: SELinux:  policy capability always_check_network=0
Dec 05 05:55:08 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 05 05:55:08 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 05 05:55:08 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 05 05:55:08 compute-0 sudo[54047]: pam_unix(sudo:session): session closed for user root
Dec 05 05:55:08 compute-0 python3.9[54204]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 05:55:09 compute-0 sudo[54360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhecjxyeockrtajnppprdvukoowtdycu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914109.301783-220-105333297470291/AnsiballZ_dnf.py'
Dec 05 05:55:09 compute-0 dbus-broker-launch[732]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Dec 05 05:55:09 compute-0 sudo[54360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:55:09 compute-0 python3.9[54362]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 05:55:10 compute-0 sudo[54360]: pam_unix(sudo:session): session closed for user root
Dec 05 05:55:10 compute-0 sudo[54513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iiyhbxadbarmondznlrnpcpnnfvppjcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914110.708647-236-261528498185624/AnsiballZ_command.py'
Dec 05 05:55:10 compute-0 sudo[54513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:55:11 compute-0 python3.9[54515]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:55:11 compute-0 sudo[54513]: pam_unix(sudo:session): session closed for user root
Dec 05 05:55:12 compute-0 sudo[54800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtnuwuxgivkglwvoeskephzcehnmjkyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914111.7627885-252-248141048840969/AnsiballZ_file.py'
Dec 05 05:55:12 compute-0 sudo[54800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:55:12 compute-0 python3.9[54802]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 05 05:55:12 compute-0 sudo[54800]: pam_unix(sudo:session): session closed for user root
Dec 05 05:55:12 compute-0 python3.9[54952]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 05:55:13 compute-0 sudo[55104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxjwkgurweimbnmeolnexssbsjscvyve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914112.8634987-284-266360835396508/AnsiballZ_dnf.py'
Dec 05 05:55:13 compute-0 sudo[55104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:55:13 compute-0 python3.9[55106]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 05:55:16 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 05 05:55:16 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 05 05:55:16 compute-0 systemd[1]: Reloading.
Dec 05 05:55:17 compute-0 systemd-sysv-generator[55148]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 05:55:17 compute-0 systemd-rc-local-generator[55145]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 05:55:17 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 05 05:55:17 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 05 05:55:17 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 05 05:55:17 compute-0 systemd[1]: run-r39b364ed15f9422183ffae313e4c63b5.service: Deactivated successfully.
Dec 05 05:55:17 compute-0 sudo[55104]: pam_unix(sudo:session): session closed for user root
Dec 05 05:55:17 compute-0 sudo[55422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvxohjtdjikwogznrrdjjkcvpvkvocwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914117.7282484-300-253836404484073/AnsiballZ_systemd.py'
Dec 05 05:55:17 compute-0 sudo[55422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:55:18 compute-0 python3.9[55424]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 05:55:18 compute-0 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec 05 05:55:18 compute-0 systemd[1]: Stopped Network Manager Wait Online.
Dec 05 05:55:18 compute-0 systemd[1]: Stopping Network Manager Wait Online...
Dec 05 05:55:18 compute-0 NetworkManager[7250]: <info>  [1764914118.1588] caught SIGTERM, shutting down normally.
Dec 05 05:55:18 compute-0 systemd[1]: Stopping Network Manager...
Dec 05 05:55:18 compute-0 NetworkManager[7250]: <info>  [1764914118.1600] dhcp4 (eth0): canceled DHCP transaction
Dec 05 05:55:18 compute-0 NetworkManager[7250]: <info>  [1764914118.1602] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 05 05:55:18 compute-0 NetworkManager[7250]: <info>  [1764914118.1602] dhcp4 (eth0): state changed no lease
Dec 05 05:55:18 compute-0 NetworkManager[7250]: <info>  [1764914118.1603] dhcp6 (eth0): canceled DHCP transaction
Dec 05 05:55:18 compute-0 NetworkManager[7250]: <info>  [1764914118.1604] dhcp6 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 05 05:55:18 compute-0 NetworkManager[7250]: <info>  [1764914118.1604] dhcp6 (eth0): state changed no lease
Dec 05 05:55:18 compute-0 NetworkManager[7250]: <info>  [1764914118.1607] manager: NetworkManager state is now CONNECTED_SITE
Dec 05 05:55:18 compute-0 NetworkManager[7250]: <info>  [1764914118.1635] exiting (success)
Dec 05 05:55:18 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 05 05:55:18 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 05 05:55:18 compute-0 systemd[1]: NetworkManager.service: Deactivated successfully.
Dec 05 05:55:18 compute-0 systemd[1]: Stopped Network Manager.
Dec 05 05:55:18 compute-0 systemd[1]: Starting Network Manager...
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2015] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:81faa267-a78f-40df-a39b-c2f64c67c1ec)
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2017] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /etc/NetworkManager/conf.d/99-cloud-init.conf
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2057] manager[0x5622b6ba6090]: monitoring kernel firmware directory '/lib/firmware'.
Dec 05 05:55:18 compute-0 systemd[1]: Starting Hostname Service...
Dec 05 05:55:18 compute-0 systemd[1]: Started Hostname Service.
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2667] hostname: hostname: using hostnamed
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2668] hostname: static hostname changed from (none) to "compute-0"
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2670] dns-mgr: init: dns=none,systemd-resolved rc-manager=unmanaged
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2673] manager[0x5622b6ba6090]: rfkill: Wi-Fi hardware radio set enabled
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2673] manager[0x5622b6ba6090]: rfkill: WWAN hardware radio set enabled
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2689] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2696] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2696] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2696] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2697] manager: Networking is enabled by state file
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2698] settings: Loaded settings plugin: keyfile (internal)
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2701] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2721] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2728] dhcp: init: Using DHCP client 'internal'
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2730] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2733] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2737] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2743] device (lo): Activation: starting connection 'lo' (fab12bac-354a-4d96-acbd-38603c43f0c0)
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2749] device (eth0): carrier: link connected
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2752] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2755] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2756] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2760] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2764] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2768] device (eth1): carrier: link connected
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2771] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2774] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (b1f40c66-9ec5-5a4e-aa92-ad1bab91b90e) (indicated)
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2774] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2778] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2782] device (eth1): Activation: starting connection 'ci-private-network' (b1f40c66-9ec5-5a4e-aa92-ad1bab91b90e)
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2787] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 05 05:55:18 compute-0 systemd[1]: Started Network Manager.
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2792] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2793] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2794] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2796] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2797] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2799] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2800] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2801] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2805] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2807] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2809] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2814] policy: set 'System eth0' (eth0) as default for IPv6 routing and DNS
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2817] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2822] dhcp6 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2839] dhcp4 (eth0): state changed new lease, address=192.168.25.227
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2843] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2861] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2862] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2863] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2866] device (lo): Activation: successful, device activated.
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2870] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2872] manager: NetworkManager state is now CONNECTED_LOCAL
Dec 05 05:55:18 compute-0 NetworkManager[55434]: <info>  [1764914118.2874] device (eth1): Activation: successful, device activated.
Dec 05 05:55:18 compute-0 systemd[1]: Starting Network Manager Wait Online...
Dec 05 05:55:18 compute-0 sudo[55422]: pam_unix(sudo:session): session closed for user root
Dec 05 05:55:18 compute-0 sudo[55631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fuyhlidbyoldxgzoqlqlgeliuiyiesxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914118.451035-316-165095527630817/AnsiballZ_dnf.py'
Dec 05 05:55:18 compute-0 sudo[55631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:55:18 compute-0 python3.9[55633]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 05:55:19 compute-0 NetworkManager[55434]: <info>  [1764914119.3683] dhcp6 (eth0): state changed new lease, address=2001:db8::242
Dec 05 05:55:19 compute-0 NetworkManager[55434]: <info>  [1764914119.3693] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 05 05:55:19 compute-0 NetworkManager[55434]: <info>  [1764914119.3718] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 05 05:55:19 compute-0 NetworkManager[55434]: <info>  [1764914119.3719] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 05 05:55:19 compute-0 NetworkManager[55434]: <info>  [1764914119.3721] manager: NetworkManager state is now CONNECTED_SITE
Dec 05 05:55:19 compute-0 NetworkManager[55434]: <info>  [1764914119.3724] device (eth0): Activation: successful, device activated.
Dec 05 05:55:19 compute-0 NetworkManager[55434]: <info>  [1764914119.3727] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 05 05:55:19 compute-0 NetworkManager[55434]: <info>  [1764914119.3728] manager: startup complete
Dec 05 05:55:19 compute-0 systemd[1]: Finished Network Manager Wait Online.
Dec 05 05:55:25 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 05 05:55:25 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 05 05:55:25 compute-0 systemd[1]: Reloading.
Dec 05 05:55:25 compute-0 systemd-sysv-generator[55702]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 05:55:25 compute-0 systemd-rc-local-generator[55699]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 05:55:26 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 05 05:55:26 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 05 05:55:26 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 05 05:55:26 compute-0 systemd[1]: run-r589bb96e45924c2e84b46656d90ed88e.service: Deactivated successfully.
Dec 05 05:55:26 compute-0 sudo[55631]: pam_unix(sudo:session): session closed for user root
Dec 05 05:55:29 compute-0 sudo[56110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijqwmulshmcfqkhhzxelkvyfabvzqzbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914129.2259817-340-230059220636567/AnsiballZ_stat.py'
Dec 05 05:55:29 compute-0 sudo[56110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:55:29 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 05 05:55:29 compute-0 python3.9[56112]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 05:55:29 compute-0 sudo[56110]: pam_unix(sudo:session): session closed for user root
Dec 05 05:55:29 compute-0 sudo[56262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylyumdhzjagewnmfgbayjhlyqbghoych ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914129.68908-358-121559053282150/AnsiballZ_ini_file.py'
Dec 05 05:55:29 compute-0 sudo[56262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:55:30 compute-0 python3.9[56264]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:55:30 compute-0 sudo[56262]: pam_unix(sudo:session): session closed for user root
Dec 05 05:55:30 compute-0 sudo[56416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odmtotpdcthsmstujnssqkciebmhnftp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914130.327067-378-123854457192729/AnsiballZ_ini_file.py'
Dec 05 05:55:30 compute-0 sudo[56416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:55:30 compute-0 python3.9[56418]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:55:30 compute-0 sudo[56416]: pam_unix(sudo:session): session closed for user root
Dec 05 05:55:30 compute-0 sudo[56568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbwtclfwwmruezllnjgroqjxjlmozyxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914130.7532246-378-55962585070921/AnsiballZ_ini_file.py'
Dec 05 05:55:30 compute-0 sudo[56568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:55:31 compute-0 python3.9[56570]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:55:31 compute-0 sudo[56568]: pam_unix(sudo:session): session closed for user root
Dec 05 05:55:31 compute-0 sudo[56722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhiagbmfgpybemoabusuqqbjuufnhhai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914131.207289-408-76792278849168/AnsiballZ_ini_file.py'
Dec 05 05:55:31 compute-0 sudo[56722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:55:31 compute-0 python3.9[56724]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:55:31 compute-0 sudo[56722]: pam_unix(sudo:session): session closed for user root
Dec 05 05:55:31 compute-0 sudo[56874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-patjrytqhoipswdgmucqahlchiojexzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914131.639339-408-235924875595188/AnsiballZ_ini_file.py'
Dec 05 05:55:31 compute-0 sudo[56874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:55:31 compute-0 python3.9[56876]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:55:31 compute-0 sudo[56874]: pam_unix(sudo:session): session closed for user root
Dec 05 05:55:32 compute-0 sudo[57026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oaadlsfmacpcizyxjppypdjhgfwgbwlp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914132.0927305-438-22657271445213/AnsiballZ_stat.py'
Dec 05 05:55:32 compute-0 sudo[57026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:55:32 compute-0 python3.9[57028]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:55:32 compute-0 sudo[57026]: pam_unix(sudo:session): session closed for user root
Dec 05 05:55:32 compute-0 sudo[57149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnzmaeyiyzqzagzuzdlgkrwatejthdnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914132.0927305-438-22657271445213/AnsiballZ_copy.py'
Dec 05 05:55:32 compute-0 sudo[57149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:55:32 compute-0 python3.9[57151]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764914132.0927305-438-22657271445213/.source _original_basename=.44y887d4 follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:55:33 compute-0 sudo[57149]: pam_unix(sudo:session): session closed for user root
Dec 05 05:55:33 compute-0 sudo[57301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqxovhlbctrqxplfgszfzrrrxmshahkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914133.1174412-468-42078983066671/AnsiballZ_file.py'
Dec 05 05:55:33 compute-0 sudo[57301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:55:33 compute-0 python3.9[57303]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:55:33 compute-0 sudo[57301]: pam_unix(sudo:session): session closed for user root
Dec 05 05:55:33 compute-0 sudo[57453]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msuzvharohimygsvobqtlbhbqphzkfjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914133.5659187-484-121381714731085/AnsiballZ_edpm_os_net_config_mappings.py'
Dec 05 05:55:33 compute-0 sudo[57453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:55:33 compute-0 python3.9[57455]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Dec 05 05:55:34 compute-0 sudo[57453]: pam_unix(sudo:session): session closed for user root
Dec 05 05:55:34 compute-0 sudo[57605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xsqbohtyyuuzlwyouuarnxaooyzzxcdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914134.1674452-502-22567257142110/AnsiballZ_file.py'
Dec 05 05:55:34 compute-0 sudo[57605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:55:34 compute-0 python3.9[57607]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:55:34 compute-0 sudo[57605]: pam_unix(sudo:session): session closed for user root
Dec 05 05:55:34 compute-0 sudo[57757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwkuawpratzybpjvyxgerecdnobafcho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914134.747952-522-100506350355627/AnsiballZ_stat.py'
Dec 05 05:55:34 compute-0 sudo[57757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:55:35 compute-0 sudo[57757]: pam_unix(sudo:session): session closed for user root
Dec 05 05:55:35 compute-0 sudo[57880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lckfdjaerzdognwjzirjvaezjppklrts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914134.747952-522-100506350355627/AnsiballZ_copy.py'
Dec 05 05:55:35 compute-0 sudo[57880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:55:35 compute-0 sudo[57880]: pam_unix(sudo:session): session closed for user root
Dec 05 05:55:35 compute-0 sudo[58032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsuhuxatbnwbjffsjcwefzrnvzsvoetn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914135.6145387-552-174049249093582/AnsiballZ_slurp.py'
Dec 05 05:55:35 compute-0 sudo[58032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:55:36 compute-0 python3.9[58034]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Dec 05 05:55:36 compute-0 sudo[58032]: pam_unix(sudo:session): session closed for user root
Dec 05 05:55:36 compute-0 sudo[58207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahjvmodjvygxrbxwznnvpwaihprdcoek ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914136.238823-570-217287345791887/async_wrapper.py j849148108798 300 /home/zuul/.ansible/tmp/ansible-tmp-1764914136.238823-570-217287345791887/AnsiballZ_edpm_os_net_config.py _'
Dec 05 05:55:36 compute-0 sudo[58207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:55:36 compute-0 ansible-async_wrapper.py[58209]: Invoked with j849148108798 300 /home/zuul/.ansible/tmp/ansible-tmp-1764914136.238823-570-217287345791887/AnsiballZ_edpm_os_net_config.py _
Dec 05 05:55:36 compute-0 ansible-async_wrapper.py[58212]: Starting module and watcher
Dec 05 05:55:36 compute-0 ansible-async_wrapper.py[58212]: Start watching 58213 (300)
Dec 05 05:55:36 compute-0 ansible-async_wrapper.py[58213]: Start module (58213)
Dec 05 05:55:36 compute-0 ansible-async_wrapper.py[58209]: Return async_wrapper task started.
Dec 05 05:55:36 compute-0 sudo[58207]: pam_unix(sudo:session): session closed for user root
Dec 05 05:55:37 compute-0 python3.9[58214]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Dec 05 05:55:37 compute-0 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Dec 05 05:55:37 compute-0 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Dec 05 05:55:37 compute-0 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Dec 05 05:55:37 compute-0 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Dec 05 05:55:37 compute-0 kernel: cfg80211: failed to load regulatory.db
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2235] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58215 uid=0 result="success"
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2246] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58215 uid=0 result="success"
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2620] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2620] audit: op="connection-add" uuid="86228ae3-d3c4-45d4-b4d4-562355bc8201" name="br-ex-br" pid=58215 uid=0 result="success"
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2631] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2632] audit: op="connection-add" uuid="5f31f202-73a7-458c-8167-65eedb5ddf18" name="br-ex-port" pid=58215 uid=0 result="success"
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2640] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2641] audit: op="connection-add" uuid="66eef391-2730-4453-9b3a-4c6e66268bce" name="eth1-port" pid=58215 uid=0 result="success"
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2649] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2650] audit: op="connection-add" uuid="b15d3733-144f-4a2a-b607-7671ab62d6ae" name="vlan20-port" pid=58215 uid=0 result="success"
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2658] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2659] audit: op="connection-add" uuid="363c2aa1-6412-4c52-bdf5-ea3c7e23146f" name="vlan21-port" pid=58215 uid=0 result="success"
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2667] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2667] audit: op="connection-add" uuid="047c2ab6-05bd-456d-b8c7-9147ee8a74e7" name="vlan22-port" pid=58215 uid=0 result="success"
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2682] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv6.method,ipv6.dhcp-timeout,ipv6.may-fail,ipv6.addr-gen-mode,ipv6.routes,connection.autoconnect-priority,connection.timestamp,ipv4.dhcp-client-id,ipv4.dhcp-timeout,802-3-ethernet.mtu" pid=58215 uid=0 result="success"
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2694] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2695] audit: op="connection-add" uuid="6aa16c72-3642-4ede-9a65-ffa6b8a772f1" name="br-ex-if" pid=58215 uid=0 result="success"
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2714] audit: op="connection-update" uuid="b1f40c66-9ec5-5a4e-aa92-ad1bab91b90e" name="ci-private-network" args="ipv6.method,ipv6.addresses,ipv6.routes,ipv6.addr-gen-mode,ipv6.dns,ipv6.routing-rules,connection.controller,connection.master,connection.port-type,connection.slave-type,connection.timestamp,ipv4.method,ipv4.addresses,ipv4.routes,ipv4.never-default,ipv4.dns,ipv4.routing-rules,ovs-interface.type,ovs-external-ids.data" pid=58215 uid=0 result="success"
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2726] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2727] audit: op="connection-add" uuid="302196a9-8ae5-4e19-8395-de085a6574a6" name="vlan20-if" pid=58215 uid=0 result="success"
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2737] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2739] audit: op="connection-add" uuid="042a5890-d74d-4c0c-83de-75756c07bfe0" name="vlan21-if" pid=58215 uid=0 result="success"
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2749] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2750] audit: op="connection-add" uuid="fc456046-18ab-46ab-844d-9f85fe114bee" name="vlan22-if" pid=58215 uid=0 result="success"
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2758] audit: op="connection-delete" uuid="df66603f-7662-3e7d-b7d2-33281c48b328" name="Wired connection 1" pid=58215 uid=0 result="success"
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2766] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2772] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2774] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (86228ae3-d3c4-45d4-b4d4-562355bc8201)
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2774] audit: op="connection-activate" uuid="86228ae3-d3c4-45d4-b4d4-562355bc8201" name="br-ex-br" pid=58215 uid=0 result="success"
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2776] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2780] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2782] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (5f31f202-73a7-458c-8167-65eedb5ddf18)
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2783] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2787] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2789] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (66eef391-2730-4453-9b3a-4c6e66268bce)
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2790] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2794] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2796] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (b15d3733-144f-4a2a-b607-7671ab62d6ae)
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2797] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2801] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2803] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (363c2aa1-6412-4c52-bdf5-ea3c7e23146f)
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2804] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2809] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2811] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (047c2ab6-05bd-456d-b8c7-9147ee8a74e7)
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2812] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2813] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2814] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2818] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2821] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2823] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (6aa16c72-3642-4ede-9a65-ffa6b8a772f1)
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2824] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2826] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2827] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2827] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2828] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2835] device (eth1): disconnecting for new activation request.
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2835] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2837] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2837] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2838] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2840] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2842] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2845] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (302196a9-8ae5-4e19-8395-de085a6574a6)
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2845] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2846] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2848] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2848] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2850] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2852] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2855] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (042a5890-d74d-4c0c-83de-75756c07bfe0)
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2855] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2857] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2858] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2858] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2860] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2862] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2864] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (fc456046-18ab-46ab-844d-9f85fe114bee)
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2865] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2866] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2867] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2868] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2869] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2877] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv6.method,ipv6.may-fail,ipv6.routes,ipv6.addr-gen-mode,connection.autoconnect-priority,ipv4.dhcp-client-id,ipv4.dhcp-timeout,802-3-ethernet.mtu" pid=58215 uid=0 result="success"
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2878] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2880] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2883] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2887] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2890] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2892] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2896] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2897] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2900] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 kernel: ovs-system: entered promiscuous mode
Dec 05 05:55:38 compute-0 kernel: Timeout policy base is empty
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2902] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2903] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2904] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2907] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2909] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2911] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2912] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2915] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2917] dhcp4 (eth0): canceled DHCP transaction
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2917] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2917] dhcp4 (eth0): state changed no lease
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2917] dhcp6 (eth0): canceled DHCP transaction
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2917] dhcp6 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2918] dhcp6 (eth0): state changed no lease
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2921] dhcp4 (eth0): activation: beginning transaction (no timeout)
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2929] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.2932] audit: op="device-reapply" interface="eth1" ifindex=3 pid=58215 uid=0 result="fail" reason="Device is not activated"
Dec 05 05:55:38 compute-0 systemd-udevd[58219]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 05:55:38 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 05 05:55:38 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3030] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3033] dhcp4 (eth0): state changed new lease, address=192.168.25.227
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3086] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Dec 05 05:55:38 compute-0 kernel: br-ex: entered promiscuous mode
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3153] device (eth1): Activation: starting connection 'ci-private-network' (b1f40c66-9ec5-5a4e-aa92-ad1bab91b90e)
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3155] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3159] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3165] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3168] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3170] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3173] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3176] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3185] device (eth1): state change: config -> deactivating (reason 'new-activation', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3186] device (eth1): released from controller device eth1
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3192] device (eth1): disconnecting for new activation request.
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3193] audit: op="connection-activate" uuid="b1f40c66-9ec5-5a4e-aa92-ad1bab91b90e" name="ci-private-network" pid=58215 uid=0 result="success"
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3193] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3194] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3195] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3196] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3196] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3200] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3207] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3210] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3214] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3217] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3220] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3225] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3228] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3230] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3233] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3245] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3245] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58215 uid=0 result="success"
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3246] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3251] device (eth1): Activation: starting connection 'ci-private-network' (b1f40c66-9ec5-5a4e-aa92-ad1bab91b90e)
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3254] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3256] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3260] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 kernel: vlan20: entered promiscuous mode
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3285] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3288] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3298] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3333] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3335] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 kernel: vlan21: entered promiscuous mode
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3350] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3387] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3388] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 systemd-udevd[58221]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 05:55:38 compute-0 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3401] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3406] device (eth1): Activation: successful, device activated.
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3446] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3459] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3460] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3465] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3488] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Dec 05 05:55:38 compute-0 kernel: vlan22: entered promiscuous mode
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3522] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3541] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3542] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3546] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3626] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3634] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3655] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3657] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 05 05:55:38 compute-0 NetworkManager[55434]: <info>  [1764914138.3660] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 05 05:55:39 compute-0 NetworkManager[55434]: <info>  [1764914139.4516] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58215 uid=0 result="success"
Dec 05 05:55:39 compute-0 NetworkManager[55434]: <info>  [1764914139.5456] checkpoint[0x5622b6b7d950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Dec 05 05:55:39 compute-0 NetworkManager[55434]: <info>  [1764914139.5458] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58215 uid=0 result="success"
Dec 05 05:55:39 compute-0 NetworkManager[55434]: <info>  [1764914139.6431] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58215 uid=0 result="success"
Dec 05 05:55:39 compute-0 NetworkManager[55434]: <info>  [1764914139.6441] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58215 uid=0 result="success"
Dec 05 05:55:39 compute-0 NetworkManager[55434]: <info>  [1764914139.7754] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58215 uid=0 result="success"
Dec 05 05:55:39 compute-0 NetworkManager[55434]: <info>  [1764914139.8699] checkpoint[0x5622b6b7da20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Dec 05 05:55:39 compute-0 NetworkManager[55434]: <info>  [1764914139.8702] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58215 uid=0 result="success"
Dec 05 05:55:40 compute-0 NetworkManager[55434]: <info>  [1764914140.0807] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/3" pid=58215 uid=0 result="success"
Dec 05 05:55:40 compute-0 NetworkManager[55434]: <info>  [1764914140.0817] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/3" pid=58215 uid=0 result="success"
Dec 05 05:55:40 compute-0 NetworkManager[55434]: <info>  [1764914140.2373] audit: op="networking-control" arg="global-dns-configuration" pid=58215 uid=0 result="success"
Dec 05 05:55:40 compute-0 NetworkManager[55434]: <info>  [1764914140.2387] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /etc/NetworkManager/conf.d/99-cloud-init.conf)
Dec 05 05:55:40 compute-0 NetworkManager[55434]: <info>  [1764914140.2392] audit: op="networking-control" arg="global-dns-configuration" pid=58215 uid=0 result="success"
Dec 05 05:55:40 compute-0 NetworkManager[55434]: <info>  [1764914140.2410] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/3" pid=58215 uid=0 result="success"
Dec 05 05:55:40 compute-0 sudo[58552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ceskvqptnuuzbukcbrqokovhyrkcyxto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914139.9606721-570-110301230131796/AnsiballZ_async_status.py'
Dec 05 05:55:40 compute-0 sudo[58552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:55:40 compute-0 NetworkManager[55434]: <info>  [1764914140.3402] checkpoint[0x5622b6b7daf0]: destroy /org/freedesktop/NetworkManager/Checkpoint/3
Dec 05 05:55:40 compute-0 NetworkManager[55434]: <info>  [1764914140.3405] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/3" pid=58215 uid=0 result="success"
Dec 05 05:55:40 compute-0 ansible-async_wrapper.py[58213]: Module complete (58213)
Dec 05 05:55:40 compute-0 python3.9[58554]: ansible-ansible.legacy.async_status Invoked with jid=j849148108798.58209 mode=status _async_dir=/root/.ansible_async
Dec 05 05:55:40 compute-0 sudo[58552]: pam_unix(sudo:session): session closed for user root
Dec 05 05:55:40 compute-0 sudo[58651]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mobgceqxtivodlwsadmmxronolfpdrwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914139.9606721-570-110301230131796/AnsiballZ_async_status.py'
Dec 05 05:55:40 compute-0 sudo[58651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:55:40 compute-0 python3.9[58653]: ansible-ansible.legacy.async_status Invoked with jid=j849148108798.58209 mode=cleanup _async_dir=/root/.ansible_async
Dec 05 05:55:40 compute-0 sudo[58651]: pam_unix(sudo:session): session closed for user root
Dec 05 05:55:41 compute-0 ansible-async_wrapper.py[58212]: Done in kid B.
Dec 05 05:55:44 compute-0 sudo[58805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-beiqirwkugisxggmkqytnlwjzytbqmnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914144.3074543-619-100648223027565/AnsiballZ_stat.py'
Dec 05 05:55:44 compute-0 sudo[58805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:55:44 compute-0 python3.9[58807]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:55:44 compute-0 sudo[58805]: pam_unix(sudo:session): session closed for user root
Dec 05 05:55:44 compute-0 sudo[58928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-repaprjniliamhmqoelslvwppkbjrkhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914144.3074543-619-100648223027565/AnsiballZ_copy.py'
Dec 05 05:55:44 compute-0 sudo[58928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:55:44 compute-0 python3.9[58930]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764914144.3074543-619-100648223027565/.source.returncode _original_basename=.9_8xt8mt follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:55:45 compute-0 sudo[58928]: pam_unix(sudo:session): session closed for user root
Dec 05 05:55:45 compute-0 sudo[59080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aieljrulsaftfhuyzofsuuorntlbmnux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914145.192124-651-35499054974699/AnsiballZ_stat.py'
Dec 05 05:55:45 compute-0 sudo[59080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:55:45 compute-0 python3.9[59082]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:55:45 compute-0 sudo[59080]: pam_unix(sudo:session): session closed for user root
Dec 05 05:55:45 compute-0 sudo[59203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyztncleuzfoewzgnskzeltkjomkzbrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914145.192124-651-35499054974699/AnsiballZ_copy.py'
Dec 05 05:55:45 compute-0 sudo[59203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:55:45 compute-0 python3.9[59205]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764914145.192124-651-35499054974699/.source.cfg _original_basename=.7ciea1mk follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:55:45 compute-0 sudo[59203]: pam_unix(sudo:session): session closed for user root
Dec 05 05:55:46 compute-0 sudo[59355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bojgqqnvzdheanfkrpnaivjrymoxvqlj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914146.0700731-681-173538294127148/AnsiballZ_systemd.py'
Dec 05 05:55:46 compute-0 sudo[59355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:55:46 compute-0 python3.9[59357]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 05:55:46 compute-0 systemd[1]: Reloading Network Manager...
Dec 05 05:55:46 compute-0 NetworkManager[55434]: <info>  [1764914146.5750] audit: op="reload" arg="0" pid=59361 uid=0 result="success"
Dec 05 05:55:46 compute-0 NetworkManager[55434]: <info>  [1764914146.5755] config: signal: SIGHUP,config-files,values,values-user,no-auto-default,dns-mode (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /etc/NetworkManager/conf.d/99-cloud-init.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Dec 05 05:55:46 compute-0 NetworkManager[55434]: <info>  [1764914146.5756] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 05 05:55:46 compute-0 systemd[1]: Reloaded Network Manager.
Dec 05 05:55:46 compute-0 sudo[59355]: pam_unix(sudo:session): session closed for user root
Dec 05 05:55:46 compute-0 sshd-session[51440]: Connection closed by 192.168.122.30 port 51090
Dec 05 05:55:47 compute-0 sshd-session[51437]: pam_unix(sshd:session): session closed for user zuul
Dec 05 05:55:47 compute-0 systemd[1]: session-11.scope: Deactivated successfully.
Dec 05 05:55:47 compute-0 systemd[1]: session-11.scope: Consumed 34.335s CPU time.
Dec 05 05:55:47 compute-0 systemd-logind[745]: Session 11 logged out. Waiting for processes to exit.
Dec 05 05:55:47 compute-0 systemd-logind[745]: Removed session 11.
Dec 05 05:55:48 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 05 05:55:51 compute-0 sshd-session[59394]: Accepted publickey for zuul from 192.168.122.30 port 37520 ssh2: ECDSA SHA256:qs1gnk6DlsWBMwOXH28ouwL9ltr8rmS6SqK96Fn6Lw8
Dec 05 05:55:51 compute-0 systemd-logind[745]: New session 12 of user zuul.
Dec 05 05:55:51 compute-0 systemd[1]: Started Session 12 of User zuul.
Dec 05 05:55:51 compute-0 sshd-session[59394]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 05:55:52 compute-0 python3.9[59547]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 05:55:53 compute-0 python3.9[59701]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 05 05:55:54 compute-0 python3.9[59891]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:55:54 compute-0 sshd-session[59397]: Connection closed by 192.168.122.30 port 37520
Dec 05 05:55:54 compute-0 sshd-session[59394]: pam_unix(sshd:session): session closed for user zuul
Dec 05 05:55:54 compute-0 systemd[1]: session-12.scope: Deactivated successfully.
Dec 05 05:55:54 compute-0 systemd[1]: session-12.scope: Consumed 1.598s CPU time.
Dec 05 05:55:54 compute-0 systemd-logind[745]: Session 12 logged out. Waiting for processes to exit.
Dec 05 05:55:54 compute-0 systemd-logind[745]: Removed session 12.
Dec 05 05:55:56 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 05 05:55:59 compute-0 sshd-session[59920]: Accepted publickey for zuul from 192.168.122.30 port 51426 ssh2: ECDSA SHA256:qs1gnk6DlsWBMwOXH28ouwL9ltr8rmS6SqK96Fn6Lw8
Dec 05 05:55:59 compute-0 systemd-logind[745]: New session 13 of user zuul.
Dec 05 05:55:59 compute-0 systemd[1]: Started Session 13 of User zuul.
Dec 05 05:55:59 compute-0 sshd-session[59920]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 05:56:00 compute-0 python3.9[60074]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 05:56:01 compute-0 python3.9[60228]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 05:56:01 compute-0 sudo[60382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcrvjqaulpmpjrsmgbusziegybxsofaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914161.4390357-60-83872776283489/AnsiballZ_setup.py'
Dec 05 05:56:01 compute-0 sudo[60382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:01 compute-0 python3.9[60384]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 05 05:56:02 compute-0 sudo[60382]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:02 compute-0 sudo[60466]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyspymzzmydkoufqunlbvgpocqpfkdsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914161.4390357-60-83872776283489/AnsiballZ_dnf.py'
Dec 05 05:56:02 compute-0 sudo[60466]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:02 compute-0 python3.9[60468]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 05:56:03 compute-0 sudo[60466]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:03 compute-0 sudo[60620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpwzgttexalnuipdpecjdvzuydqyqmyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914163.5440054-84-131033785793973/AnsiballZ_setup.py'
Dec 05 05:56:03 compute-0 sudo[60620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:03 compute-0 python3.9[60622]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 05 05:56:04 compute-0 sudo[60620]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:04 compute-0 sudo[60811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzwfwwmnglmbzosbrvtjcmqdhvmnqwso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914164.370219-106-164679503450567/AnsiballZ_file.py'
Dec 05 05:56:04 compute-0 sudo[60811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:04 compute-0 python3.9[60813]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:56:04 compute-0 sudo[60811]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:05 compute-0 sudo[60963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odeamqhemzndeocginrsumitdgpwctef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914164.9794433-122-168578317329173/AnsiballZ_command.py'
Dec 05 05:56:05 compute-0 sudo[60963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:05 compute-0 python3.9[60965]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:56:05 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 05:56:05 compute-0 sudo[60963]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:05 compute-0 sudo[61124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-begtxrbysgpfatwwhdxqdyzmgijqvzpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914165.6150582-138-80482587159288/AnsiballZ_stat.py'
Dec 05 05:56:05 compute-0 sudo[61124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:06 compute-0 python3.9[61126]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:56:06 compute-0 sudo[61124]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:06 compute-0 sudo[61202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpsdyejhcwelqtztktnozqopyvmkqthx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914165.6150582-138-80482587159288/AnsiballZ_file.py'
Dec 05 05:56:06 compute-0 sudo[61202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:06 compute-0 python3.9[61204]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:56:06 compute-0 sudo[61202]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:06 compute-0 sudo[61355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehhetrtnhytucuqqjmiyoineeypxatiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914166.543043-162-279806969532872/AnsiballZ_stat.py'
Dec 05 05:56:06 compute-0 sudo[61355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:06 compute-0 python3.9[61357]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:56:06 compute-0 sudo[61355]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:07 compute-0 sudo[61433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzkfmozalrrufmhpiptinuynjjszjxeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914166.543043-162-279806969532872/AnsiballZ_file.py'
Dec 05 05:56:07 compute-0 sudo[61433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:07 compute-0 python3.9[61435]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:56:07 compute-0 sudo[61433]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:07 compute-0 sudo[61585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cegvakzjrvudylrnhembitxaotxthvab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914167.3903532-188-54321586038417/AnsiballZ_ini_file.py'
Dec 05 05:56:07 compute-0 sudo[61585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:07 compute-0 python3.9[61587]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:56:07 compute-0 sudo[61585]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:08 compute-0 sudo[61737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eukaywsfcsiucillwlfmwbiyaxsgpbij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914167.9151163-188-8990201728933/AnsiballZ_ini_file.py'
Dec 05 05:56:08 compute-0 sudo[61737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:08 compute-0 python3.9[61739]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:56:08 compute-0 sudo[61737]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:08 compute-0 sudo[61889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohpethdaynnfgugepshfowbryuhbjbql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914168.3275683-188-102241686968201/AnsiballZ_ini_file.py'
Dec 05 05:56:08 compute-0 sudo[61889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:08 compute-0 python3.9[61891]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:56:08 compute-0 sudo[61889]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:08 compute-0 sudo[62041]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shxstvqwdkuwaqgmdkprxvnywwuskedo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914168.7620041-188-82707541752892/AnsiballZ_ini_file.py'
Dec 05 05:56:08 compute-0 sudo[62041]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:09 compute-0 python3.9[62043]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:56:09 compute-0 sudo[62041]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:09 compute-0 sudo[62194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzajvjozsfkhxhsrmwdikpslmrlileaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914169.4093502-250-57677064896666/AnsiballZ_dnf.py'
Dec 05 05:56:09 compute-0 sudo[62194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:09 compute-0 python3.9[62196]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 05:56:10 compute-0 sudo[62194]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:11 compute-0 sudo[62347]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plqqeqznqbphoqqzimzxnhvxhkooswqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914171.04582-272-116297399102312/AnsiballZ_setup.py'
Dec 05 05:56:11 compute-0 sudo[62347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:11 compute-0 python3.9[62349]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 05:56:11 compute-0 sudo[62347]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:11 compute-0 sudo[62501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jirrhgapxxtffapeexypqotyhcmhsafj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914171.615411-288-1644149423325/AnsiballZ_stat.py'
Dec 05 05:56:11 compute-0 sudo[62501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:11 compute-0 python3.9[62503]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 05:56:11 compute-0 sudo[62501]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:12 compute-0 sudo[62653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbwegtumxsnnsnhwhggkiqgfbwhbgzri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914172.1159809-306-73179626027388/AnsiballZ_stat.py'
Dec 05 05:56:12 compute-0 sudo[62653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:12 compute-0 python3.9[62655]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 05:56:12 compute-0 sudo[62653]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:12 compute-0 sudo[62805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cksvtpbdhhnujkeaieydeknvjiacgopf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914172.672595-326-138826519481514/AnsiballZ_command.py'
Dec 05 05:56:12 compute-0 sudo[62805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:12 compute-0 python3.9[62807]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:56:13 compute-0 sudo[62805]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:13 compute-0 sudo[62958]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vukhvdsinnczkyhkboflggvupiuvduvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914173.239206-346-209038339858119/AnsiballZ_service_facts.py'
Dec 05 05:56:13 compute-0 sudo[62958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:13 compute-0 python3.9[62960]: ansible-service_facts Invoked
Dec 05 05:56:13 compute-0 network[62977]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 05 05:56:13 compute-0 network[62978]: 'network-scripts' will be removed from distribution in near future.
Dec 05 05:56:13 compute-0 network[62979]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 05 05:56:15 compute-0 sudo[62958]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:16 compute-0 sudo[63262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tuiwjkuvfcfsurztjxcdjomfgchqkufc ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1764914176.0666602-376-167123494809046/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1764914176.0666602-376-167123494809046/args'
Dec 05 05:56:16 compute-0 sudo[63262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:16 compute-0 sudo[63262]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:16 compute-0 sudo[63429]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfmyeljeospubbbmwppsmawmeudalrlv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914176.552408-398-201349664818759/AnsiballZ_dnf.py'
Dec 05 05:56:16 compute-0 sudo[63429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:16 compute-0 python3.9[63431]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 05:56:17 compute-0 sudo[63429]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:18 compute-0 sudo[63582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkgbkzcbqomoguzwfshgqvwmwulgvwfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914178.1768467-424-1480878957798/AnsiballZ_package_facts.py'
Dec 05 05:56:18 compute-0 sudo[63582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:18 compute-0 python3.9[63584]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Dec 05 05:56:19 compute-0 sudo[63582]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:19 compute-0 sudo[63734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djuvpjrkfgwxsgvmnlosgjodornosetf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914179.6374385-444-76157015461355/AnsiballZ_stat.py'
Dec 05 05:56:19 compute-0 sudo[63734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:19 compute-0 python3.9[63736]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:56:20 compute-0 sudo[63734]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:20 compute-0 sudo[63859]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkqddqvmxbfzlmwkubffitqvctzsdjsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914179.6374385-444-76157015461355/AnsiballZ_copy.py'
Dec 05 05:56:20 compute-0 sudo[63859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:20 compute-0 python3.9[63861]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764914179.6374385-444-76157015461355/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:56:20 compute-0 sudo[63859]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:20 compute-0 sudo[64013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvndlamvdgipgrltlwduzmuomchinwkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914180.6368222-474-121031470192091/AnsiballZ_stat.py'
Dec 05 05:56:20 compute-0 sudo[64013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:20 compute-0 python3.9[64015]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:56:20 compute-0 sudo[64013]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:21 compute-0 sudo[64138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twvivogtwclznvklokurbsezosytybmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914180.6368222-474-121031470192091/AnsiballZ_copy.py'
Dec 05 05:56:21 compute-0 sudo[64138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:21 compute-0 python3.9[64140]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764914180.6368222-474-121031470192091/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:56:21 compute-0 sudo[64138]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:22 compute-0 sudo[64292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guypalxbgahqcmvukmnltdlikexnoifw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914182.02147-516-194814052063223/AnsiballZ_lineinfile.py'
Dec 05 05:56:22 compute-0 sudo[64292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:22 compute-0 python3.9[64294]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:56:22 compute-0 sudo[64292]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:23 compute-0 sudo[64446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ortnjrltgnkzrvudxlimcivyffrbhakl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914183.0222383-546-134170916578641/AnsiballZ_setup.py'
Dec 05 05:56:23 compute-0 sudo[64446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:23 compute-0 python3.9[64448]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 05 05:56:23 compute-0 sudo[64446]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:23 compute-0 sudo[64530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jklmyovittmlymxjmnboogcyjcnevauq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914183.0222383-546-134170916578641/AnsiballZ_systemd.py'
Dec 05 05:56:23 compute-0 sudo[64530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:24 compute-0 python3.9[64532]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 05:56:24 compute-0 sudo[64530]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:24 compute-0 sudo[64684]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkjferxsavlomowvcqtlpzfzvofcaalr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914184.802106-578-142236407932151/AnsiballZ_setup.py'
Dec 05 05:56:24 compute-0 sudo[64684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:25 compute-0 python3.9[64686]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 05 05:56:25 compute-0 sudo[64684]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:25 compute-0 sudo[64768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pexqznlpbahzrtxyannutoqntokuvaoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914184.802106-578-142236407932151/AnsiballZ_systemd.py'
Dec 05 05:56:25 compute-0 sudo[64768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:25 compute-0 python3.9[64770]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 05:56:25 compute-0 chronyd[752]: chronyd exiting
Dec 05 05:56:25 compute-0 systemd[1]: Stopping NTP client/server...
Dec 05 05:56:25 compute-0 systemd[1]: chronyd.service: Deactivated successfully.
Dec 05 05:56:25 compute-0 systemd[1]: Stopped NTP client/server.
Dec 05 05:56:25 compute-0 systemd[1]: Starting NTP client/server...
Dec 05 05:56:25 compute-0 chronyd[64778]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Dec 05 05:56:25 compute-0 chronyd[64778]: Frequency -4.466 +/- 0.634 ppm read from /var/lib/chrony/drift
Dec 05 05:56:25 compute-0 chronyd[64778]: Loaded seccomp filter (level 2)
Dec 05 05:56:25 compute-0 systemd[1]: Started NTP client/server.
Dec 05 05:56:25 compute-0 sudo[64768]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:26 compute-0 sshd-session[59923]: Connection closed by 192.168.122.30 port 51426
Dec 05 05:56:26 compute-0 sshd-session[59920]: pam_unix(sshd:session): session closed for user zuul
Dec 05 05:56:26 compute-0 systemd[1]: session-13.scope: Deactivated successfully.
Dec 05 05:56:26 compute-0 systemd[1]: session-13.scope: Consumed 17.372s CPU time.
Dec 05 05:56:26 compute-0 systemd-logind[745]: Session 13 logged out. Waiting for processes to exit.
Dec 05 05:56:26 compute-0 systemd-logind[745]: Removed session 13.
Dec 05 05:56:30 compute-0 sshd-session[64804]: Accepted publickey for zuul from 192.168.122.30 port 38042 ssh2: ECDSA SHA256:qs1gnk6DlsWBMwOXH28ouwL9ltr8rmS6SqK96Fn6Lw8
Dec 05 05:56:30 compute-0 systemd-logind[745]: New session 14 of user zuul.
Dec 05 05:56:30 compute-0 systemd[1]: Started Session 14 of User zuul.
Dec 05 05:56:30 compute-0 sshd-session[64804]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 05:56:31 compute-0 python3.9[64957]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 05:56:32 compute-0 sudo[65111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilyfknceviirgqdsmegsnxhwmnyejugy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914192.009192-46-250422451107580/AnsiballZ_file.py'
Dec 05 05:56:32 compute-0 sudo[65111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:32 compute-0 python3.9[65113]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:56:32 compute-0 sudo[65111]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:32 compute-0 sudo[65286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnerrpvzntqvshzvxnmbkechkomilzhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914192.5732071-62-66660402764107/AnsiballZ_stat.py'
Dec 05 05:56:32 compute-0 sudo[65286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:33 compute-0 python3.9[65288]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:56:33 compute-0 sudo[65286]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:33 compute-0 sudo[65364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cuuylcxhqjgjxnbrzxafduuxinqvdpmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914192.5732071-62-66660402764107/AnsiballZ_file.py'
Dec 05 05:56:33 compute-0 sudo[65364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:33 compute-0 python3.9[65366]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=._11blmw9 recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:56:33 compute-0 sudo[65364]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:33 compute-0 sudo[65516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzfvusomtuijzsbltloitkrvkzqbfqdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914193.6474-102-166599514083545/AnsiballZ_stat.py'
Dec 05 05:56:33 compute-0 sudo[65516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:33 compute-0 python3.9[65518]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:56:33 compute-0 sudo[65516]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:34 compute-0 sudo[65639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsxscjacdgsyerkfjgwetscvfeuekzgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914193.6474-102-166599514083545/AnsiballZ_copy.py'
Dec 05 05:56:34 compute-0 sudo[65639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:34 compute-0 python3.9[65641]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764914193.6474-102-166599514083545/.source _original_basename=.sq9qgf3i follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:56:34 compute-0 sudo[65639]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:34 compute-0 sudo[65791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkaqqzbwdlahsxzastoyqlmutdrvspyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914194.5977151-134-216866726951629/AnsiballZ_file.py'
Dec 05 05:56:34 compute-0 sudo[65791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:34 compute-0 python3.9[65793]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:56:34 compute-0 sudo[65791]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:35 compute-0 sudo[65943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqsrpxvpgwvbsaksstmndxwcikpazgbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914195.0532944-150-123885074403936/AnsiballZ_stat.py'
Dec 05 05:56:35 compute-0 sudo[65943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:35 compute-0 python3.9[65945]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:56:35 compute-0 sudo[65943]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:35 compute-0 sudo[66066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irxtiyasmdnjqijdzvdfmpqxmvmnkbzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914195.0532944-150-123885074403936/AnsiballZ_copy.py'
Dec 05 05:56:35 compute-0 sudo[66066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:35 compute-0 python3.9[66068]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764914195.0532944-150-123885074403936/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:56:35 compute-0 sudo[66066]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:35 compute-0 sudo[66218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axzdlpsagrnrfhwuqstunvqbgzvgxsns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914195.8221772-150-208854719540571/AnsiballZ_stat.py'
Dec 05 05:56:35 compute-0 sudo[66218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:36 compute-0 python3.9[66220]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:56:36 compute-0 sudo[66218]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:36 compute-0 sudo[66341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmpxgowiavjdaysdnevdyolqpygsdswq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914195.8221772-150-208854719540571/AnsiballZ_copy.py'
Dec 05 05:56:36 compute-0 sudo[66341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:36 compute-0 python3.9[66343]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764914195.8221772-150-208854719540571/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:56:36 compute-0 sudo[66341]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:36 compute-0 sudo[66493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adverkojawwnivrjyzymfhdhaqcoorvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914196.665223-208-36726292918622/AnsiballZ_file.py'
Dec 05 05:56:36 compute-0 sudo[66493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:36 compute-0 python3.9[66495]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:56:36 compute-0 sudo[66493]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:37 compute-0 sudo[66645]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tygrkjxhilggprjzcuybfblquejpsqcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914197.0960536-224-96963353874607/AnsiballZ_stat.py'
Dec 05 05:56:37 compute-0 sudo[66645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:37 compute-0 python3.9[66647]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:56:37 compute-0 sudo[66645]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:37 compute-0 sudo[66768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnyzgysyazelzuqxnhdefmsgfirenkgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914197.0960536-224-96963353874607/AnsiballZ_copy.py'
Dec 05 05:56:37 compute-0 sudo[66768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:37 compute-0 python3.9[66770]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914197.0960536-224-96963353874607/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:56:37 compute-0 sudo[66768]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:38 compute-0 sudo[66920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yogrixsnfcvrqgcgxanaxklpwlsojfbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914197.8927526-254-90396899830403/AnsiballZ_stat.py'
Dec 05 05:56:38 compute-0 sudo[66920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:38 compute-0 python3.9[66922]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:56:38 compute-0 sudo[66920]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:38 compute-0 sudo[67043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-razxluzpunkdxronbvvzowsmkxprthju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914197.8927526-254-90396899830403/AnsiballZ_copy.py'
Dec 05 05:56:38 compute-0 sudo[67043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:38 compute-0 python3.9[67045]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914197.8927526-254-90396899830403/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:56:38 compute-0 sudo[67043]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:39 compute-0 sudo[67195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecyqnnnvkfifvzclsykaogaromsnubts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914198.6772163-284-99311866701968/AnsiballZ_systemd.py'
Dec 05 05:56:39 compute-0 sudo[67195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:39 compute-0 python3.9[67197]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 05:56:39 compute-0 systemd[1]: Reloading.
Dec 05 05:56:39 compute-0 systemd-sysv-generator[67227]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 05:56:39 compute-0 systemd-rc-local-generator[67222]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 05:56:39 compute-0 systemd[1]: Reloading.
Dec 05 05:56:39 compute-0 systemd-rc-local-generator[67259]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 05:56:39 compute-0 systemd-sysv-generator[67263]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 05:56:39 compute-0 systemd[1]: Starting EDPM Container Shutdown...
Dec 05 05:56:39 compute-0 systemd[1]: Finished EDPM Container Shutdown.
Dec 05 05:56:39 compute-0 sudo[67195]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:39 compute-0 sudo[67423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvhtryxkxgouskmrnwylfpopdqtprroa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914199.828406-300-202871348068414/AnsiballZ_stat.py'
Dec 05 05:56:39 compute-0 sudo[67423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:40 compute-0 python3.9[67425]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:56:40 compute-0 sudo[67423]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:40 compute-0 sudo[67546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aowkuwdwzrypnhstwockgtqfmzqphdnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914199.828406-300-202871348068414/AnsiballZ_copy.py'
Dec 05 05:56:40 compute-0 sudo[67546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:40 compute-0 python3.9[67548]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914199.828406-300-202871348068414/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:56:40 compute-0 sudo[67546]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:40 compute-0 sudo[67698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klyzwttbjfxhbnigsqzfjbovtxbofdmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914200.61731-330-98747738908624/AnsiballZ_stat.py'
Dec 05 05:56:40 compute-0 sudo[67698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:40 compute-0 python3.9[67700]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:56:40 compute-0 sudo[67698]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:41 compute-0 sudo[67821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqtfzfkazuiviiyzwmobucjzsoqauktx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914200.61731-330-98747738908624/AnsiballZ_copy.py'
Dec 05 05:56:41 compute-0 sudo[67821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:41 compute-0 python3.9[67823]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914200.61731-330-98747738908624/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:56:41 compute-0 sudo[67821]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:41 compute-0 sudo[67973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzvpmqppcnojkwebvqfmfhhvrzjrjqbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914201.4308956-360-191665553072684/AnsiballZ_systemd.py'
Dec 05 05:56:41 compute-0 sudo[67973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:41 compute-0 python3.9[67975]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 05:56:41 compute-0 systemd[1]: Reloading.
Dec 05 05:56:41 compute-0 systemd-sysv-generator[68001]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 05:56:41 compute-0 systemd-rc-local-generator[67998]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 05:56:42 compute-0 systemd[1]: Reloading.
Dec 05 05:56:42 compute-0 systemd-rc-local-generator[68033]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 05:56:42 compute-0 systemd-sysv-generator[68036]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 05:56:42 compute-0 systemd[1]: Starting Create netns directory...
Dec 05 05:56:42 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 05 05:56:42 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 05 05:56:42 compute-0 systemd[1]: Finished Create netns directory.
Dec 05 05:56:42 compute-0 sudo[67973]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:42 compute-0 python3.9[68202]: ansible-ansible.builtin.service_facts Invoked
Dec 05 05:56:42 compute-0 network[68219]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 05 05:56:42 compute-0 network[68220]: 'network-scripts' will be removed from distribution in near future.
Dec 05 05:56:42 compute-0 network[68221]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 05 05:56:44 compute-0 sudo[68481]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkbgwbtqbfnhbzybvxtqqsuriwxvbggi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914204.6969376-392-207367266239909/AnsiballZ_systemd.py'
Dec 05 05:56:44 compute-0 sudo[68481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:45 compute-0 python3.9[68483]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 05:56:45 compute-0 systemd[1]: Reloading.
Dec 05 05:56:45 compute-0 systemd-rc-local-generator[68506]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 05:56:45 compute-0 systemd-sysv-generator[68509]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 05:56:45 compute-0 systemd[1]: Stopping IPv4 firewall with iptables...
Dec 05 05:56:45 compute-0 iptables.init[68522]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Dec 05 05:56:45 compute-0 iptables.init[68522]: iptables: Flushing firewall rules: [  OK  ]
Dec 05 05:56:45 compute-0 systemd[1]: iptables.service: Deactivated successfully.
Dec 05 05:56:45 compute-0 systemd[1]: Stopped IPv4 firewall with iptables.
Dec 05 05:56:45 compute-0 sudo[68481]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:45 compute-0 sudo[68716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgbgblhddmfpdhderknvtvgobcmhjhow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914205.6597605-392-229935957198380/AnsiballZ_systemd.py'
Dec 05 05:56:45 compute-0 sudo[68716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:46 compute-0 python3.9[68718]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 05:56:46 compute-0 sudo[68716]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:46 compute-0 sudo[68870]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngedhaurvswvxzsdaxvukienarfsauiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914206.3045065-424-259669549817598/AnsiballZ_systemd.py'
Dec 05 05:56:46 compute-0 sudo[68870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:46 compute-0 python3.9[68872]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 05:56:46 compute-0 systemd[1]: Reloading.
Dec 05 05:56:46 compute-0 systemd-sysv-generator[68899]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 05:56:46 compute-0 systemd-rc-local-generator[68895]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 05:56:46 compute-0 systemd[1]: Starting Netfilter Tables...
Dec 05 05:56:47 compute-0 systemd[1]: Finished Netfilter Tables.
Dec 05 05:56:47 compute-0 sudo[68870]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:47 compute-0 sudo[69062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itgsblrbsykneoafxyzkpivxebiulryd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914207.1410215-440-83957882954612/AnsiballZ_command.py'
Dec 05 05:56:47 compute-0 sudo[69062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:47 compute-0 python3.9[69064]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:56:47 compute-0 sudo[69062]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:48 compute-0 sudo[69215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agjfbgebyforqnvwxuldceqrhcxfnftj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914207.9024327-468-275195681862108/AnsiballZ_stat.py'
Dec 05 05:56:48 compute-0 sudo[69215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:48 compute-0 python3.9[69217]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:56:48 compute-0 sudo[69215]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:48 compute-0 sudo[69340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptajzztpivpkcujafifzxjgsjfostnsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914207.9024327-468-275195681862108/AnsiballZ_copy.py'
Dec 05 05:56:48 compute-0 sudo[69340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:48 compute-0 python3.9[69342]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764914207.9024327-468-275195681862108/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:56:48 compute-0 sudo[69340]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:48 compute-0 sudo[69493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrfhqttqizreslgcmtnnwxqvvqokgzjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914208.766891-498-104736028885678/AnsiballZ_systemd.py'
Dec 05 05:56:48 compute-0 sudo[69493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:49 compute-0 python3.9[69495]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 05:56:49 compute-0 systemd[1]: Reloading OpenSSH server daemon...
Dec 05 05:56:49 compute-0 sshd[962]: Received SIGHUP; restarting.
Dec 05 05:56:49 compute-0 systemd[1]: Reloaded OpenSSH server daemon.
Dec 05 05:56:49 compute-0 sshd[962]: Server listening on 0.0.0.0 port 22.
Dec 05 05:56:49 compute-0 sshd[962]: Server listening on :: port 22.
Dec 05 05:56:49 compute-0 sudo[69493]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:49 compute-0 sudo[69649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdudfcylquamfbbikbkxaqpiwvvrcvqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914209.3818023-514-178712866997516/AnsiballZ_file.py'
Dec 05 05:56:49 compute-0 sudo[69649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:49 compute-0 python3.9[69651]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:56:49 compute-0 sudo[69649]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:49 compute-0 sudo[69801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjlipujgyekcdeuceqpjindersscdibi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914209.8348753-530-138929640693892/AnsiballZ_stat.py'
Dec 05 05:56:50 compute-0 sudo[69801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:50 compute-0 python3.9[69803]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:56:50 compute-0 sudo[69801]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:50 compute-0 sudo[69924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhguebcbbwnmgfaygvgaapsmhxedojxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914209.8348753-530-138929640693892/AnsiballZ_copy.py'
Dec 05 05:56:50 compute-0 sudo[69924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:50 compute-0 python3.9[69926]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914209.8348753-530-138929640693892/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:56:50 compute-0 sudo[69924]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:51 compute-0 sudo[70076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xltmvbuwivaxbpgtardtjnawfxtacchn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914210.7441115-566-265015083265787/AnsiballZ_timezone.py'
Dec 05 05:56:51 compute-0 sudo[70076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:51 compute-0 python3.9[70078]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec 05 05:56:51 compute-0 systemd[1]: Starting Time & Date Service...
Dec 05 05:56:51 compute-0 systemd[1]: Started Time & Date Service.
Dec 05 05:56:51 compute-0 sudo[70076]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:51 compute-0 sudo[70232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivcwygebhctwmnlibkkmcecukeyzitit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914211.4487412-584-268646350492093/AnsiballZ_file.py'
Dec 05 05:56:51 compute-0 sudo[70232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:51 compute-0 python3.9[70234]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:56:51 compute-0 sudo[70232]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:52 compute-0 sudo[70384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysadhxwpvcpgujrbruyckywtwwzhjtwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914211.8890622-600-249327352297093/AnsiballZ_stat.py'
Dec 05 05:56:52 compute-0 sudo[70384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:52 compute-0 python3.9[70386]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:56:52 compute-0 sudo[70384]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:52 compute-0 sudo[70507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycedxlbubatqlquljfsakqnftiaermpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914211.8890622-600-249327352297093/AnsiballZ_copy.py'
Dec 05 05:56:52 compute-0 sudo[70507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:52 compute-0 python3.9[70509]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764914211.8890622-600-249327352297093/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:56:52 compute-0 sudo[70507]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:52 compute-0 sudo[70659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slgnyodhqipaaurkdradvsjhqehlxxpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914212.717119-630-34770347269007/AnsiballZ_stat.py'
Dec 05 05:56:52 compute-0 sudo[70659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:53 compute-0 python3.9[70661]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:56:53 compute-0 sudo[70659]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:53 compute-0 sudo[70782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bystpjdcgexaqixhnpdsnbztzjpkerlf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914212.717119-630-34770347269007/AnsiballZ_copy.py'
Dec 05 05:56:53 compute-0 sudo[70782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:53 compute-0 python3.9[70784]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764914212.717119-630-34770347269007/.source.yaml _original_basename=.qcphcrbj follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:56:53 compute-0 sudo[70782]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:53 compute-0 sudo[70934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibneasakfpnamxqxoybavawucvsrigmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914213.5343585-660-183350423988782/AnsiballZ_stat.py'
Dec 05 05:56:53 compute-0 sudo[70934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:53 compute-0 python3.9[70936]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:56:53 compute-0 sudo[70934]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:54 compute-0 sudo[71057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pddzaulxktgzeodmnpuzyuapvqkpppnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914213.5343585-660-183350423988782/AnsiballZ_copy.py'
Dec 05 05:56:54 compute-0 sudo[71057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:54 compute-0 python3.9[71059]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914213.5343585-660-183350423988782/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:56:54 compute-0 sudo[71057]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:54 compute-0 sudo[71209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zncsrxfnuzdqynmxgxicmowwmmzdscmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914214.3426692-690-186942638015230/AnsiballZ_command.py'
Dec 05 05:56:54 compute-0 sudo[71209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:54 compute-0 python3.9[71211]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:56:54 compute-0 sudo[71209]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:54 compute-0 sudo[71362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vroojcubiaxfacolkwaokipbzvjpxxjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914214.7852736-706-181988368078553/AnsiballZ_command.py'
Dec 05 05:56:54 compute-0 sudo[71362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:55 compute-0 python3.9[71364]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:56:55 compute-0 sudo[71362]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:55 compute-0 sudo[71515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcpkbiepjmwjwhknxcgcfzixxwtaaiuq ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764914215.2297242-722-252975264124065/AnsiballZ_edpm_nftables_from_files.py'
Dec 05 05:56:55 compute-0 sudo[71515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:55 compute-0 python3[71517]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 05 05:56:55 compute-0 sudo[71515]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:56 compute-0 sudo[71667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ayovlofzyokowdsmxnxepofgulvvkvcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914215.8147435-738-245642810471475/AnsiballZ_stat.py'
Dec 05 05:56:56 compute-0 sudo[71667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:56 compute-0 python3.9[71669]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:56:56 compute-0 sudo[71667]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:56 compute-0 sudo[71790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lncdnmzzsccongomvispyhacllbegcvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914215.8147435-738-245642810471475/AnsiballZ_copy.py'
Dec 05 05:56:56 compute-0 sudo[71790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:56 compute-0 python3.9[71792]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914215.8147435-738-245642810471475/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:56:56 compute-0 sudo[71790]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:56 compute-0 sudo[71942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnucyrkdajfporctralupdsuezrldxxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914216.6800795-768-15409108921210/AnsiballZ_stat.py'
Dec 05 05:56:56 compute-0 sudo[71942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:57 compute-0 python3.9[71944]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:56:57 compute-0 sudo[71942]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:57 compute-0 sudo[72065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvezmjnjpdvhxgzqdesfixsfmemheugi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914216.6800795-768-15409108921210/AnsiballZ_copy.py'
Dec 05 05:56:57 compute-0 sudo[72065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:57 compute-0 python3.9[72067]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914216.6800795-768-15409108921210/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:56:57 compute-0 sudo[72065]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:57 compute-0 sudo[72217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxujxyqflnixfdmakwqhexslbrxvhywk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914217.5636103-798-272837035033664/AnsiballZ_stat.py'
Dec 05 05:56:57 compute-0 sudo[72217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:57 compute-0 python3.9[72219]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:56:57 compute-0 sudo[72217]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:58 compute-0 sudo[72340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpfqqimpbqctxamtljkmnuztrflngjar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914217.5636103-798-272837035033664/AnsiballZ_copy.py'
Dec 05 05:56:58 compute-0 sudo[72340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:58 compute-0 python3.9[72342]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914217.5636103-798-272837035033664/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:56:58 compute-0 sudo[72340]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:58 compute-0 sudo[72492]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkqpugvpzquqioobjmipolxrxwaazefq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914218.4123907-828-229360133914187/AnsiballZ_stat.py'
Dec 05 05:56:58 compute-0 sudo[72492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:58 compute-0 python3.9[72494]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:56:58 compute-0 sudo[72492]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:58 compute-0 sudo[72615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whrmvqzuonhqzszipbkpfepkqkjpzzxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914218.4123907-828-229360133914187/AnsiballZ_copy.py'
Dec 05 05:56:58 compute-0 sudo[72615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:59 compute-0 python3.9[72617]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914218.4123907-828-229360133914187/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:56:59 compute-0 sudo[72615]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:59 compute-0 sudo[72767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iayicegzjlooatbcvyhkyvxwwusmaozj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914219.2582684-858-104856238207800/AnsiballZ_stat.py'
Dec 05 05:56:59 compute-0 sudo[72767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:56:59 compute-0 python3.9[72769]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:56:59 compute-0 sudo[72767]: pam_unix(sudo:session): session closed for user root
Dec 05 05:56:59 compute-0 sudo[72890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ustqoepzgdjjifublckmtdmateqjcyby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914219.2582684-858-104856238207800/AnsiballZ_copy.py'
Dec 05 05:56:59 compute-0 sudo[72890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:00 compute-0 python3.9[72892]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914219.2582684-858-104856238207800/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:57:00 compute-0 sudo[72890]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:00 compute-0 sudo[73042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffgkbnepssizovlsrwucevselqbhyonn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914220.1298008-888-53978219927523/AnsiballZ_file.py'
Dec 05 05:57:00 compute-0 sudo[73042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:00 compute-0 python3.9[73044]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:57:00 compute-0 sudo[73042]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:00 compute-0 sudo[73194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-beccegthxmrhsknsmgdjrisvnpdqjqjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914220.5953124-904-131457065498304/AnsiballZ_command.py'
Dec 05 05:57:00 compute-0 sudo[73194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:00 compute-0 python3.9[73196]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:57:00 compute-0 sudo[73194]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:01 compute-0 sudo[73353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykeushywtjvbnhdyciiefbjydhltzqey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914221.0759344-920-188868139104903/AnsiballZ_blockinfile.py'
Dec 05 05:57:01 compute-0 sudo[73353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:01 compute-0 python3.9[73355]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:57:01 compute-0 sudo[73353]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:01 compute-0 sudo[73506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evwgxvkxkawsoyyciipjnczutrpsifiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914221.6914678-938-220577646089369/AnsiballZ_file.py'
Dec 05 05:57:01 compute-0 sudo[73506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:02 compute-0 python3.9[73508]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:57:02 compute-0 sudo[73506]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:02 compute-0 sudo[73658]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-litnprkitthzunrpoeekxmjmfnzbjzmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914222.1116827-938-162932831527589/AnsiballZ_file.py'
Dec 05 05:57:02 compute-0 sudo[73658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:02 compute-0 python3.9[73660]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:57:02 compute-0 sudo[73658]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:02 compute-0 sudo[73810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-escrkzqjxkzfhmpguvivxgtqimliiukp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914222.5672176-968-93053919259369/AnsiballZ_mount.py'
Dec 05 05:57:02 compute-0 sudo[73810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:03 compute-0 python3.9[73812]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec 05 05:57:03 compute-0 sudo[73810]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:03 compute-0 sudo[73963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hefcdhdtxyovmzaefpwrckcwxpozkjhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914223.1506324-968-135354942962396/AnsiballZ_mount.py'
Dec 05 05:57:03 compute-0 sudo[73963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:03 compute-0 python3.9[73965]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec 05 05:57:03 compute-0 sudo[73963]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:03 compute-0 sshd-session[64807]: Connection closed by 192.168.122.30 port 38042
Dec 05 05:57:03 compute-0 sshd-session[64804]: pam_unix(sshd:session): session closed for user zuul
Dec 05 05:57:03 compute-0 systemd-logind[745]: Session 14 logged out. Waiting for processes to exit.
Dec 05 05:57:03 compute-0 systemd[1]: session-14.scope: Deactivated successfully.
Dec 05 05:57:03 compute-0 systemd[1]: session-14.scope: Consumed 23.966s CPU time.
Dec 05 05:57:03 compute-0 systemd-logind[745]: Removed session 14.
Dec 05 05:57:09 compute-0 sshd-session[73991]: Accepted publickey for zuul from 192.168.122.30 port 44114 ssh2: ECDSA SHA256:qs1gnk6DlsWBMwOXH28ouwL9ltr8rmS6SqK96Fn6Lw8
Dec 05 05:57:09 compute-0 systemd-logind[745]: New session 15 of user zuul.
Dec 05 05:57:09 compute-0 systemd[1]: Started Session 15 of User zuul.
Dec 05 05:57:09 compute-0 sshd-session[73991]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 05:57:09 compute-0 sudo[74144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqsmvasykdjeiwusxzpnbtfymxuafcgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914229.3938117-17-65043437773212/AnsiballZ_tempfile.py'
Dec 05 05:57:09 compute-0 sudo[74144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:09 compute-0 python3.9[74146]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Dec 05 05:57:09 compute-0 sudo[74144]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:09 compute-0 rsyslogd[961]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 05 05:57:10 compute-0 sudo[74297]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikdotjdrquuxhzcstfccsmzptfwcjpgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914229.9782748-41-275475646038575/AnsiballZ_stat.py'
Dec 05 05:57:10 compute-0 sudo[74297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:10 compute-0 python3.9[74299]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 05:57:10 compute-0 sudo[74297]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:10 compute-0 sudo[74449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpfajxsfboqsgqjjzblpgvojrhtjhjce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914230.5659308-61-158762618229205/AnsiballZ_setup.py'
Dec 05 05:57:10 compute-0 sudo[74449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:11 compute-0 python3.9[74451]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 05:57:11 compute-0 sudo[74449]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:11 compute-0 sudo[74601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fljnxvqupaezcpfendvnfdmhaweglkwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914231.3942347-78-46698210840887/AnsiballZ_blockinfile.py'
Dec 05 05:57:11 compute-0 sudo[74601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:11 compute-0 python3.9[74603]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCccgWMTtAxYqQxSKzPW5EH+CRtkj0gb23vRu9CMOa2M9nrvmODZkvZ4+N6cJIueaOMwcCxPUrm+cr8783QH1n4OIzCB4rLgmlPoPluLIh81FwOMfhr0rPuzoDWP174fOthIWi+BksXdipFhD6rp1/LqqTdp06PdnsaczAyRtRNz6KHH1oFSVXG7aj4HIwdvtgLVttFB4VPMn6rx0jouWdATigZlWq6URKYyiI2ZepwrYC2Wy0/rMKgkk6Bj1CVOvj1JW+Z7t/Xo7rytnzaaN494eI+z/3Y8XTha5bE+PWPONMo4NL1zEMBvTlMRjgaG4NM8XW6iXKc2+ioZndCa99bhHwqv8JXN87NIU1OVPbWAvFYaZRWOq9K3k3u3K+/bW0Dyyiwk3wmbo69nmplNW7A/MN0mfClPrmaHZlyoc/EgP/jmq1IthwpE/DMAfk7NwTE8notBr48oCUCEiDaaCs+c1JIwDKiwtA9n6s2Dv5bf312bJzpWpc1lu1jPzl9XHc=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHp4IlF4CPY3hvFjuTOcL5liXepinRaHKxpxw14TYILq
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAQsAwpXZVgVJIMNmTPPENAHF2pXMnNKAEXPv5SwvwjTQOg3lZBK2pZN/swRhwYipEaLunIjNYccqjXHLQwe+sQ=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDZLu5FX/1yLxTpdsOxDvbIxhpDCz44wzTXcZQdG3a74ji+6LiBCVhTt+orFg3r96DdEfbLvVQIe/Qa/NMd/u1XN+iYj6QNn2MPWtypzKqyfEeDnKwaAXiV1JQ1PRRXi3RZm951qqN5EsMeMzi6HUSwLOs1mRrRngAGRk+qa4WEe83g2gJJxbCssuf/k6fogpbQhJNdu+LyE2lOYBfFBqTNabjk9+3FlquGw6gs+r3S2NlqQsZIKhzfN8Lf7DHkbjutBlmDIBhr+fXEv+sAaJucensLuWv0Wr4aWpYM7HXwfOTLqDCovvmfFOqxDOw/55lch+LzKKQq1uToBenHRsnnefJEg9f++H85ajXxQABznSVaGPZF3o1pkDohYxjhLeLgNhU7NbIgu797cimtvACWcYMP9MTHw/27E3BqFAOCLhtMZ1unVBhspdR7Jv3A8oqtIfB7Cgd6aS95U2XMRZ8a/N4ysT3599cvaY17g30vZARzoLZ0BMlXFaFyLPpUQ6E=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIJ1p8+ikWKPDKLB7xg08a+Uo5RD/Fcz0lI0c0D01nyUF
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNXOyHG8K8GTjIConOxkdqjhJyUXVVx8Ipl+pPOkn7WmRP79XuhJKXjWBPn7Y3KIXNzJdUY2urSAje7Amc439d4=
                                             create=True mode=0644 path=/tmp/ansible.hx0tqr_y state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:57:11 compute-0 sudo[74601]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:12 compute-0 sudo[74753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofcvdszkrirdrjxoklyybijvimnvbxxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914231.9694695-94-175682961965735/AnsiballZ_command.py'
Dec 05 05:57:12 compute-0 sudo[74753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:12 compute-0 python3.9[74755]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.hx0tqr_y' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:57:12 compute-0 sudo[74753]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:12 compute-0 sudo[74907]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsaarprfqljclsawuimhpbrikcbsouxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914232.5281124-110-279691698957833/AnsiballZ_file.py'
Dec 05 05:57:12 compute-0 sudo[74907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:12 compute-0 python3.9[74909]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.hx0tqr_y state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:57:12 compute-0 sudo[74907]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:13 compute-0 sshd-session[73994]: Connection closed by 192.168.122.30 port 44114
Dec 05 05:57:13 compute-0 sshd-session[73991]: pam_unix(sshd:session): session closed for user zuul
Dec 05 05:57:13 compute-0 systemd[1]: session-15.scope: Deactivated successfully.
Dec 05 05:57:13 compute-0 systemd[1]: session-15.scope: Consumed 2.363s CPU time.
Dec 05 05:57:13 compute-0 systemd-logind[745]: Session 15 logged out. Waiting for processes to exit.
Dec 05 05:57:13 compute-0 systemd-logind[745]: Removed session 15.
Dec 05 05:57:18 compute-0 sshd-session[74934]: Accepted publickey for zuul from 192.168.122.30 port 54836 ssh2: ECDSA SHA256:qs1gnk6DlsWBMwOXH28ouwL9ltr8rmS6SqK96Fn6Lw8
Dec 05 05:57:18 compute-0 systemd-logind[745]: New session 16 of user zuul.
Dec 05 05:57:18 compute-0 systemd[1]: Started Session 16 of User zuul.
Dec 05 05:57:18 compute-0 sshd-session[74934]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 05:57:18 compute-0 python3.9[75087]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 05:57:19 compute-0 sudo[75241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abfkrwarsrshdexmqsninwlngkjqsgrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914239.2132764-44-222343785833086/AnsiballZ_systemd.py'
Dec 05 05:57:19 compute-0 sudo[75241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:19 compute-0 python3.9[75243]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 05 05:57:19 compute-0 sudo[75241]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:20 compute-0 sudo[75395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etdduccwdmfnziodhegmdzuaqawsmlpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914240.054183-60-48736530340057/AnsiballZ_systemd.py'
Dec 05 05:57:20 compute-0 sudo[75395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:20 compute-0 python3.9[75397]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 05:57:20 compute-0 sudo[75395]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:20 compute-0 sudo[75548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oszvotujkkkemhxpclnunlxkurbnadpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914240.6906993-78-105163011328911/AnsiballZ_command.py'
Dec 05 05:57:20 compute-0 sudo[75548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:21 compute-0 python3.9[75550]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:57:21 compute-0 sudo[75548]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:21 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 05 05:57:21 compute-0 sudo[75703]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdmxiirmfuolhaexlzcwivzenrsqrcsk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914241.2813902-94-238445265482975/AnsiballZ_stat.py'
Dec 05 05:57:21 compute-0 sudo[75703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:21 compute-0 python3.9[75705]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 05:57:21 compute-0 sudo[75703]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:22 compute-0 sudo[75857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hoelwexoxvpnbxmxxvqnanvfosbogyji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914241.8422253-110-172590792111610/AnsiballZ_command.py'
Dec 05 05:57:22 compute-0 sudo[75857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:22 compute-0 python3.9[75859]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:57:22 compute-0 sudo[75857]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:22 compute-0 sudo[76012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkhjbanpwqasscryfxsejkzrccoiohtu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914242.302883-126-199491844885015/AnsiballZ_file.py'
Dec 05 05:57:22 compute-0 sudo[76012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:22 compute-0 python3.9[76014]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:57:22 compute-0 sudo[76012]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:22 compute-0 sshd-session[74937]: Connection closed by 192.168.122.30 port 54836
Dec 05 05:57:22 compute-0 sshd-session[74934]: pam_unix(sshd:session): session closed for user zuul
Dec 05 05:57:23 compute-0 systemd[1]: session-16.scope: Deactivated successfully.
Dec 05 05:57:23 compute-0 systemd[1]: session-16.scope: Consumed 3.310s CPU time.
Dec 05 05:57:23 compute-0 systemd-logind[745]: Session 16 logged out. Waiting for processes to exit.
Dec 05 05:57:23 compute-0 systemd-logind[745]: Removed session 16.
Dec 05 05:57:28 compute-0 sshd-session[76039]: Accepted publickey for zuul from 192.168.122.30 port 59130 ssh2: ECDSA SHA256:qs1gnk6DlsWBMwOXH28ouwL9ltr8rmS6SqK96Fn6Lw8
Dec 05 05:57:28 compute-0 systemd-logind[745]: New session 17 of user zuul.
Dec 05 05:57:28 compute-0 systemd[1]: Started Session 17 of User zuul.
Dec 05 05:57:28 compute-0 sshd-session[76039]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 05:57:29 compute-0 python3.9[76192]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 05:57:29 compute-0 sudo[76346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivldjdzyfaoqmucdxvqsxkypfzjvjcks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914249.378297-48-169206472329005/AnsiballZ_setup.py'
Dec 05 05:57:29 compute-0 sudo[76346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:29 compute-0 python3.9[76348]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 05 05:57:29 compute-0 sudo[76346]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:30 compute-0 sudo[76430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iavqvzcmqxnjhqzqkemprjhqnkbznqxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914249.378297-48-169206472329005/AnsiballZ_dnf.py'
Dec 05 05:57:30 compute-0 sudo[76430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:30 compute-0 python3.9[76432]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 05 05:57:31 compute-0 sudo[76430]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:31 compute-0 python3.9[76583]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:57:32 compute-0 python3.9[76734]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 05 05:57:33 compute-0 python3.9[76884]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 05:57:33 compute-0 python3.9[77034]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 05:57:34 compute-0 sshd-session[76042]: Connection closed by 192.168.122.30 port 59130
Dec 05 05:57:34 compute-0 sshd-session[76039]: pam_unix(sshd:session): session closed for user zuul
Dec 05 05:57:34 compute-0 systemd[1]: session-17.scope: Deactivated successfully.
Dec 05 05:57:34 compute-0 systemd[1]: session-17.scope: Consumed 4.124s CPU time.
Dec 05 05:57:34 compute-0 systemd-logind[745]: Session 17 logged out. Waiting for processes to exit.
Dec 05 05:57:34 compute-0 systemd-logind[745]: Removed session 17.
Dec 05 05:57:39 compute-0 sshd-session[77059]: Accepted publickey for zuul from 192.168.122.30 port 60878 ssh2: ECDSA SHA256:qs1gnk6DlsWBMwOXH28ouwL9ltr8rmS6SqK96Fn6Lw8
Dec 05 05:57:39 compute-0 systemd-logind[745]: New session 18 of user zuul.
Dec 05 05:57:39 compute-0 systemd[1]: Started Session 18 of User zuul.
Dec 05 05:57:39 compute-0 sshd-session[77059]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 05:57:40 compute-0 python3.9[77212]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 05:57:41 compute-0 sudo[77366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdzilrndedpyyfemccpspygngrdrrbeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914261.044444-80-169392782627613/AnsiballZ_file.py'
Dec 05 05:57:41 compute-0 sudo[77366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:41 compute-0 python3.9[77368]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:57:41 compute-0 sudo[77366]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:41 compute-0 sudo[77518]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvflnodtxxjufvwnatcfbcvpesxhpony ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914261.6052637-80-234233762575865/AnsiballZ_file.py'
Dec 05 05:57:41 compute-0 sudo[77518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:41 compute-0 python3.9[77520]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:57:41 compute-0 sudo[77518]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:42 compute-0 sudo[77670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isckidrxwmkeitfsfujutetjqoxtjasc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914262.0645003-111-54074491952437/AnsiballZ_stat.py'
Dec 05 05:57:42 compute-0 sudo[77670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:42 compute-0 python3.9[77672]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:57:42 compute-0 sudo[77670]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:42 compute-0 sudo[77793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrpiaugdsvycoukzhdacuhbsgbtiforg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914262.0645003-111-54074491952437/AnsiballZ_copy.py'
Dec 05 05:57:42 compute-0 sudo[77793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:43 compute-0 python3.9[77795]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914262.0645003-111-54074491952437/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=3a3ee39e5902c16dd99d2fd8db9237255ab264a3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:57:43 compute-0 sudo[77793]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:43 compute-0 sudo[77945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-suaopgpwdufyaabebdwziihmqdqooqbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914263.1337829-111-140394066184582/AnsiballZ_stat.py'
Dec 05 05:57:43 compute-0 sudo[77945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:43 compute-0 python3.9[77947]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:57:43 compute-0 sudo[77945]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:43 compute-0 sudo[78068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxaopyogzaowydxrxcukymuzsgjrwraf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914263.1337829-111-140394066184582/AnsiballZ_copy.py'
Dec 05 05:57:43 compute-0 sudo[78068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:43 compute-0 python3.9[78070]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914263.1337829-111-140394066184582/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=538a415c688fb5cc783a999c2914201a6901786a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:57:43 compute-0 sudo[78068]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:44 compute-0 sudo[78220]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itkrllrpaxdsueztaynyopzpstbfcgik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914263.9297743-111-148210562289193/AnsiballZ_stat.py'
Dec 05 05:57:44 compute-0 sudo[78220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:44 compute-0 python3.9[78222]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:57:44 compute-0 sudo[78220]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:44 compute-0 sudo[78343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvyingxpmgfixwylvjqxafcfcfyiomkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914263.9297743-111-148210562289193/AnsiballZ_copy.py'
Dec 05 05:57:44 compute-0 sudo[78343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:44 compute-0 python3.9[78345]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914263.9297743-111-148210562289193/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=af4035dd0dc48722463e3a18ee707f6d664b35ba backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:57:44 compute-0 sudo[78343]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:44 compute-0 sudo[78495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrkjaojulpzdavaeqlqpwhaoqrpffyzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914264.762126-198-8232105024097/AnsiballZ_file.py'
Dec 05 05:57:44 compute-0 sudo[78495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:45 compute-0 python3.9[78497]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:57:45 compute-0 sudo[78495]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:45 compute-0 sudo[78647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpauodmxtcgvkqjxlzjxkylacpzgijkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914265.1675231-198-167152992482484/AnsiballZ_file.py'
Dec 05 05:57:45 compute-0 sudo[78647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:45 compute-0 python3.9[78649]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:57:45 compute-0 sudo[78647]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:45 compute-0 sudo[78799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgihpxqfkyjaryjsmyhsdwvzhvyylmlx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914265.6022651-228-158221992319797/AnsiballZ_stat.py'
Dec 05 05:57:45 compute-0 sudo[78799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:45 compute-0 python3.9[78801]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:57:45 compute-0 sudo[78799]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:46 compute-0 sudo[78922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdqgxsdpjhjegndtrolqacbyybktdwin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914265.6022651-228-158221992319797/AnsiballZ_copy.py'
Dec 05 05:57:46 compute-0 sudo[78922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:46 compute-0 python3.9[78924]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914265.6022651-228-158221992319797/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=78a6d94fab21bd5b1793cc85f3cbfbfd4e94b5dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:57:46 compute-0 sudo[78922]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:46 compute-0 sudo[79074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgxulikhphlgwwrqjnicnjxkuffclgcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914266.3779173-228-73261535066565/AnsiballZ_stat.py'
Dec 05 05:57:46 compute-0 sudo[79074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:46 compute-0 python3.9[79076]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:57:46 compute-0 sudo[79074]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:46 compute-0 sudo[79197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qljotolqteqvqttfcjezlecexvhwwteo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914266.3779173-228-73261535066565/AnsiballZ_copy.py'
Dec 05 05:57:46 compute-0 sudo[79197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:47 compute-0 python3.9[79199]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914266.3779173-228-73261535066565/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=edabfafba2dfcf82f160f29d481f5f174cadee62 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:57:47 compute-0 sudo[79197]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:47 compute-0 sudo[79349]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skwjumjbiksxdzjlwzcvpmljorlxzguw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914267.1623871-228-203904248826009/AnsiballZ_stat.py'
Dec 05 05:57:47 compute-0 sudo[79349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:47 compute-0 python3.9[79351]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:57:47 compute-0 sudo[79349]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:47 compute-0 sudo[79472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otlhfvzdjlfixraqrkvdfokqdjuwvgeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914267.1623871-228-203904248826009/AnsiballZ_copy.py'
Dec 05 05:57:47 compute-0 sudo[79472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:47 compute-0 python3.9[79474]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914267.1623871-228-203904248826009/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=3e29479b7f322cbfa74ebc229eb203da0f0d36bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:57:47 compute-0 sudo[79472]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:48 compute-0 sudo[79624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzfjlstcbufryvqcycefejyeczwjqaag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914268.0371513-318-162267038267934/AnsiballZ_file.py'
Dec 05 05:57:48 compute-0 sudo[79624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:48 compute-0 python3.9[79626]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:57:48 compute-0 sudo[79624]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:48 compute-0 sudo[79776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzfxyitscfxuiwvyaofbepfpgnyusauv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914268.4839835-318-74047726765160/AnsiballZ_file.py'
Dec 05 05:57:48 compute-0 sudo[79776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:48 compute-0 python3.9[79778]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:57:48 compute-0 sudo[79776]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:49 compute-0 sudo[79928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxueqaksgjdgitadytyslyaobdgcinns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914268.958165-348-74514775540973/AnsiballZ_stat.py'
Dec 05 05:57:49 compute-0 sudo[79928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:49 compute-0 python3.9[79930]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:57:49 compute-0 sudo[79928]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:49 compute-0 sudo[80051]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtctevknxplymhkbsscztqepbuekudtc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914268.958165-348-74514775540973/AnsiballZ_copy.py'
Dec 05 05:57:49 compute-0 sudo[80051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:49 compute-0 python3.9[80053]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914268.958165-348-74514775540973/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=92595a713bab2af975c931398034b0df296441a8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:57:49 compute-0 sudo[80051]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:49 compute-0 sudo[80203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asbeohjdyqobpnnuimcbkjxduuxjqstg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914269.7607822-348-189523295790426/AnsiballZ_stat.py'
Dec 05 05:57:49 compute-0 sudo[80203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:50 compute-0 python3.9[80205]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:57:50 compute-0 sudo[80203]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:50 compute-0 sudo[80326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onxialqrmeyxmjmgakbvvuldtemvtmhw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914269.7607822-348-189523295790426/AnsiballZ_copy.py'
Dec 05 05:57:50 compute-0 sudo[80326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:50 compute-0 python3.9[80328]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914269.7607822-348-189523295790426/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=c31905d541e9cb4f9786bbb8d6da34c0f2f2c36b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:57:50 compute-0 sudo[80326]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:50 compute-0 sudo[80478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-extcmfcwarbdriwdtotbdiqjxxpvlhkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914270.5576413-348-140813495825680/AnsiballZ_stat.py'
Dec 05 05:57:50 compute-0 sudo[80478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:50 compute-0 python3.9[80480]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:57:50 compute-0 sudo[80478]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:51 compute-0 sudo[80601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnyxcpfxynkeeolnjscjiulqsemvtnrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914270.5576413-348-140813495825680/AnsiballZ_copy.py'
Dec 05 05:57:51 compute-0 sudo[80601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:51 compute-0 python3.9[80603]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914270.5576413-348-140813495825680/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=de200ac2c1ccebda144d524a7aed7cebd3e1279a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:57:51 compute-0 sudo[80601]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:51 compute-0 sudo[80753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amykqswihesndbrzxglmquhtwlemzeev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914271.3928106-439-94958308956189/AnsiballZ_file.py'
Dec 05 05:57:51 compute-0 sudo[80753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:51 compute-0 python3.9[80755]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:57:51 compute-0 sudo[80753]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:51 compute-0 sudo[80905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oeypitbievffjnivaurfomrxzzubtoxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914271.8090956-439-110579850743821/AnsiballZ_file.py'
Dec 05 05:57:51 compute-0 sudo[80905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:52 compute-0 python3.9[80907]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:57:52 compute-0 sudo[80905]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:52 compute-0 sudo[81057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqwqllweggzdyojdqwkikdypbfzcrjnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914272.2559419-467-184927554098072/AnsiballZ_stat.py'
Dec 05 05:57:52 compute-0 sudo[81057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:52 compute-0 python3.9[81059]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:57:52 compute-0 sudo[81057]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:52 compute-0 sudo[81180]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dypsherwfkqzzvtexrbevarhyihiecwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914272.2559419-467-184927554098072/AnsiballZ_copy.py'
Dec 05 05:57:52 compute-0 sudo[81180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:52 compute-0 python3.9[81182]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914272.2559419-467-184927554098072/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=36c6e8bfd9182bf8c50bd188b8c0f4d90404720d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:57:52 compute-0 sudo[81180]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:53 compute-0 sudo[81332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwmgjxecyankxkbqztampqmsuvdgqaxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914273.027777-467-80898069511884/AnsiballZ_stat.py'
Dec 05 05:57:53 compute-0 sudo[81332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:53 compute-0 python3.9[81334]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:57:53 compute-0 sudo[81332]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:53 compute-0 sudo[81455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aevqwnneyxaoddnxnygtyhanrstabkki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914273.027777-467-80898069511884/AnsiballZ_copy.py'
Dec 05 05:57:53 compute-0 sudo[81455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:53 compute-0 python3.9[81457]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914273.027777-467-80898069511884/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=c31905d541e9cb4f9786bbb8d6da34c0f2f2c36b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:57:53 compute-0 sudo[81455]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:53 compute-0 sudo[81607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkowkxyfruwmcmqqhybqqimotxjlpgvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914273.8109074-467-162107384630411/AnsiballZ_stat.py'
Dec 05 05:57:53 compute-0 sudo[81607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:54 compute-0 python3.9[81609]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:57:54 compute-0 sudo[81607]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:54 compute-0 sudo[81730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnjfqrjpisbzlpaokytwqahievijtqgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914273.8109074-467-162107384630411/AnsiballZ_copy.py'
Dec 05 05:57:54 compute-0 sudo[81730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:54 compute-0 python3.9[81732]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914273.8109074-467-162107384630411/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=4b9541d5dcff60d9410b3884c3bcc49f90d0dbe3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:57:54 compute-0 sudo[81730]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:55 compute-0 sudo[81882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okvqgdxjkiukqhedleaafqoopprjsbhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914275.0725899-584-116865983292000/AnsiballZ_file.py'
Dec 05 05:57:55 compute-0 sudo[81882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:55 compute-0 python3.9[81884]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:57:55 compute-0 sudo[81882]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:55 compute-0 sudo[82034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nuzitayeqtlzlzmtmvisupzfmuuffubx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914275.5309296-600-275471482306631/AnsiballZ_stat.py'
Dec 05 05:57:55 compute-0 sudo[82034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:55 compute-0 python3.9[82036]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:57:55 compute-0 sudo[82034]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:56 compute-0 sudo[82157]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcrainscohtwwauxvexnthwqhirsumbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914275.5309296-600-275471482306631/AnsiballZ_copy.py'
Dec 05 05:57:56 compute-0 sudo[82157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:56 compute-0 python3.9[82159]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914275.5309296-600-275471482306631/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=1e190ef277e596f2b1eff87bf5ee2d33d529f85c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:57:56 compute-0 sudo[82157]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:56 compute-0 sudo[82309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlxthpkxwpyfiyfsqehwrwuimqpiqhmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914276.396087-632-71037372116311/AnsiballZ_file.py'
Dec 05 05:57:56 compute-0 sudo[82309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:56 compute-0 python3.9[82311]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:57:56 compute-0 sudo[82309]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:57 compute-0 sudo[82461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjonzvdeujofzdmyqdekzalwhfpmpcnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914276.852375-648-215027644209122/AnsiballZ_stat.py'
Dec 05 05:57:57 compute-0 sudo[82461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:57 compute-0 python3.9[82463]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:57:57 compute-0 sudo[82461]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:57 compute-0 sudo[82584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnehjccrguxaplrjxwelewolbmlrkvup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914276.852375-648-215027644209122/AnsiballZ_copy.py'
Dec 05 05:57:57 compute-0 sudo[82584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:57 compute-0 python3.9[82586]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914276.852375-648-215027644209122/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=1e190ef277e596f2b1eff87bf5ee2d33d529f85c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:57:57 compute-0 sudo[82584]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:57 compute-0 sudo[82736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vescmyomegvhnqklwapwuicnxesxwjay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914277.7250183-680-193427844058458/AnsiballZ_file.py'
Dec 05 05:57:57 compute-0 sudo[82736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:58 compute-0 python3.9[82738]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:57:58 compute-0 sudo[82736]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:58 compute-0 sudo[82888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akaqmgdzxehikjwbylwxkayjjhpvdtgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914278.1849337-696-31023312916763/AnsiballZ_stat.py'
Dec 05 05:57:58 compute-0 sudo[82888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:58 compute-0 python3.9[82890]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:57:58 compute-0 sudo[82888]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:58 compute-0 sudo[83011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zuonccejapqoxceugbzcblyqhdocrvwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914278.1849337-696-31023312916763/AnsiballZ_copy.py'
Dec 05 05:57:58 compute-0 sudo[83011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:58 compute-0 python3.9[83013]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914278.1849337-696-31023312916763/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=1e190ef277e596f2b1eff87bf5ee2d33d529f85c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:57:58 compute-0 sudo[83011]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:59 compute-0 sudo[83163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdwmqnnfyjktzktlqolpjmwmkpbbtgqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914279.0412974-728-185659945754824/AnsiballZ_file.py'
Dec 05 05:57:59 compute-0 sudo[83163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:59 compute-0 python3.9[83165]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:57:59 compute-0 sudo[83163]: pam_unix(sudo:session): session closed for user root
Dec 05 05:57:59 compute-0 sudo[83315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-manetapgncpbceabwbupqchzkmzhryyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914279.5012038-744-50049974920430/AnsiballZ_stat.py'
Dec 05 05:57:59 compute-0 sudo[83315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:57:59 compute-0 python3.9[83317]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:57:59 compute-0 sudo[83315]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:00 compute-0 sudo[83438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwakhzcvpvdoheyglndwvmiujvtxobol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914279.5012038-744-50049974920430/AnsiballZ_copy.py'
Dec 05 05:58:00 compute-0 sudo[83438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:00 compute-0 python3.9[83440]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914279.5012038-744-50049974920430/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=1e190ef277e596f2b1eff87bf5ee2d33d529f85c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:58:00 compute-0 sudo[83438]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:00 compute-0 sudo[83590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkqmhsutabiqwnwqhpnehcueysbybhyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914280.3537664-776-181517509322611/AnsiballZ_file.py'
Dec 05 05:58:00 compute-0 sudo[83590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:00 compute-0 python3.9[83592]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:58:00 compute-0 sudo[83590]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:00 compute-0 sudo[83742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xthobxyisvabqtzwgsdyylijqxlzfsid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914280.8023667-791-140932870742253/AnsiballZ_stat.py'
Dec 05 05:58:00 compute-0 sudo[83742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:01 compute-0 python3.9[83744]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:58:01 compute-0 sudo[83742]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:01 compute-0 sudo[83865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnrvwvbywojzaxktgonbwzfaejhysyfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914280.8023667-791-140932870742253/AnsiballZ_copy.py'
Dec 05 05:58:01 compute-0 sudo[83865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:01 compute-0 python3.9[83867]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914280.8023667-791-140932870742253/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=1e190ef277e596f2b1eff87bf5ee2d33d529f85c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:58:01 compute-0 sudo[83865]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:01 compute-0 sudo[84017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvvxmkndczlknpotrtipdqjvqoywbyts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914281.6796865-822-174985337880571/AnsiballZ_file.py'
Dec 05 05:58:01 compute-0 sudo[84017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:01 compute-0 python3.9[84019]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:58:02 compute-0 sudo[84017]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:02 compute-0 sudo[84169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rngtjpumsnmuvpwgboudouhuqzohlybc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914282.1125042-839-74680542743652/AnsiballZ_stat.py'
Dec 05 05:58:02 compute-0 sudo[84169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:02 compute-0 python3.9[84171]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:58:02 compute-0 sudo[84169]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:02 compute-0 sudo[84292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdjccwnrotilysypvdalgznhufcmjzdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914282.1125042-839-74680542743652/AnsiballZ_copy.py'
Dec 05 05:58:02 compute-0 sudo[84292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:02 compute-0 python3.9[84294]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914282.1125042-839-74680542743652/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=1e190ef277e596f2b1eff87bf5ee2d33d529f85c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:58:02 compute-0 sudo[84292]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:03 compute-0 sudo[84444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jizfszjnxavtcgtxhdenrvmosatnhxtq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914282.95368-870-230936289814897/AnsiballZ_file.py'
Dec 05 05:58:03 compute-0 sudo[84444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:03 compute-0 python3.9[84446]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:58:03 compute-0 sudo[84444]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:03 compute-0 sudo[84596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-caoanvdajotqqzovfheyhrhqiwgkegyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914283.3912919-887-53335919710793/AnsiballZ_stat.py'
Dec 05 05:58:03 compute-0 sudo[84596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:03 compute-0 python3.9[84598]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:58:03 compute-0 sudo[84596]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:03 compute-0 sudo[84719]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmawbjypagamvbthzudmrdqiumfjirfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914283.3912919-887-53335919710793/AnsiballZ_copy.py'
Dec 05 05:58:03 compute-0 sudo[84719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:04 compute-0 python3.9[84721]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914283.3912919-887-53335919710793/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=1e190ef277e596f2b1eff87bf5ee2d33d529f85c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:58:04 compute-0 sudo[84719]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:04 compute-0 sshd-session[77062]: Connection closed by 192.168.122.30 port 60878
Dec 05 05:58:04 compute-0 sshd-session[77059]: pam_unix(sshd:session): session closed for user zuul
Dec 05 05:58:04 compute-0 systemd-logind[745]: Session 18 logged out. Waiting for processes to exit.
Dec 05 05:58:04 compute-0 systemd[1]: session-18.scope: Deactivated successfully.
Dec 05 05:58:04 compute-0 systemd[1]: session-18.scope: Consumed 19.267s CPU time.
Dec 05 05:58:04 compute-0 systemd-logind[745]: Removed session 18.
Dec 05 05:58:09 compute-0 sshd-session[84746]: Accepted publickey for zuul from 192.168.122.30 port 55230 ssh2: ECDSA SHA256:qs1gnk6DlsWBMwOXH28ouwL9ltr8rmS6SqK96Fn6Lw8
Dec 05 05:58:09 compute-0 systemd-logind[745]: New session 19 of user zuul.
Dec 05 05:58:09 compute-0 systemd[1]: Started Session 19 of User zuul.
Dec 05 05:58:09 compute-0 sshd-session[84746]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 05:58:10 compute-0 python3.9[84899]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 05:58:11 compute-0 sudo[85053]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwmkblultnqqghyzflkvdqdacumjbpke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914290.926718-48-29771614901784/AnsiballZ_file.py'
Dec 05 05:58:11 compute-0 sudo[85053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:11 compute-0 python3.9[85055]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:58:11 compute-0 sudo[85053]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:11 compute-0 sudo[85205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omidgejukxlhrgtnmvwwadohwesrnzzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914291.4598815-48-40770524851495/AnsiballZ_file.py'
Dec 05 05:58:11 compute-0 sudo[85205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:11 compute-0 python3.9[85207]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:58:11 compute-0 sudo[85205]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:12 compute-0 python3.9[85357]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 05:58:12 compute-0 sudo[85507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwuyntdgxlbpdokjecdfvrlgdghinkee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914292.4420905-94-165288269928763/AnsiballZ_seboolean.py'
Dec 05 05:58:12 compute-0 sudo[85507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:12 compute-0 python3.9[85509]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec 05 05:58:13 compute-0 sudo[85507]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:14 compute-0 sudo[85663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohceltprirjmnammyjsyqtvmguueddir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914293.8608768-114-30845566879360/AnsiballZ_setup.py'
Dec 05 05:58:14 compute-0 dbus-broker-launch[732]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Dec 05 05:58:14 compute-0 sudo[85663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:14 compute-0 python3.9[85665]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 05 05:58:14 compute-0 sudo[85663]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:14 compute-0 sudo[85747]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqpbhldfuqpykvdwodcrqrkpkqgmuwqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914293.8608768-114-30845566879360/AnsiballZ_dnf.py'
Dec 05 05:58:14 compute-0 sudo[85747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:14 compute-0 python3.9[85749]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 05:58:15 compute-0 sudo[85747]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:16 compute-0 sudo[85900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxhspusqkslcteorkubooiiessvbeczp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914295.9938676-138-155867993565682/AnsiballZ_systemd.py'
Dec 05 05:58:16 compute-0 sudo[85900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:16 compute-0 python3.9[85902]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 05 05:58:16 compute-0 sudo[85900]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:17 compute-0 sudo[86055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iehixlvptaodjktulvpdejqubehoggza ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764914296.816236-154-125091684034923/AnsiballZ_edpm_nftables_snippet.py'
Dec 05 05:58:17 compute-0 sudo[86055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:17 compute-0 python3[86057]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Dec 05 05:58:17 compute-0 sudo[86055]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:17 compute-0 sudo[86207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpjynimgtzihultoygxnpguwfawahyxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914297.4619675-172-100166717751989/AnsiballZ_file.py'
Dec 05 05:58:17 compute-0 sudo[86207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:17 compute-0 python3.9[86209]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:58:17 compute-0 sudo[86207]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:18 compute-0 sudo[86359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfepqfumwgaebglxrkxneeknxiyhlafg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914297.9086387-188-128098028679484/AnsiballZ_stat.py'
Dec 05 05:58:18 compute-0 sudo[86359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:18 compute-0 python3.9[86361]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:58:18 compute-0 sudo[86359]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:18 compute-0 sudo[86437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqepgcsnlqzgnsnnxtyaogvrdekremzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914297.9086387-188-128098028679484/AnsiballZ_file.py'
Dec 05 05:58:18 compute-0 sudo[86437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:18 compute-0 python3.9[86439]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:58:18 compute-0 sudo[86437]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:18 compute-0 sudo[86589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdsxtpjqtqkfvcnfcbwxkhhkwfmdpfap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914298.7847874-212-168082155634411/AnsiballZ_stat.py'
Dec 05 05:58:18 compute-0 sudo[86589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:19 compute-0 python3.9[86591]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:58:19 compute-0 sudo[86589]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:19 compute-0 sudo[86667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnyugexrmilemxfdeqymxxovzrztxcxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914298.7847874-212-168082155634411/AnsiballZ_file.py'
Dec 05 05:58:19 compute-0 sudo[86667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:19 compute-0 python3.9[86669]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.s_9tlk6y recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:58:19 compute-0 sudo[86667]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:19 compute-0 sudo[86819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulwxputwredkxcwxkqwpfgttqahfxiwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914299.5778346-236-223919103286521/AnsiballZ_stat.py'
Dec 05 05:58:19 compute-0 sudo[86819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:19 compute-0 python3.9[86821]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:58:19 compute-0 sudo[86819]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:20 compute-0 sudo[86897]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mivcoyjfmqyxcwiuhlxchshoelpgsxme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914299.5778346-236-223919103286521/AnsiballZ_file.py'
Dec 05 05:58:20 compute-0 sudo[86897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:20 compute-0 python3.9[86899]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:58:20 compute-0 sudo[86897]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:20 compute-0 sudo[87049]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pboccjijkjvyfjvoscuecocpjjxpqepe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914300.3888538-262-249977783912607/AnsiballZ_command.py'
Dec 05 05:58:20 compute-0 sudo[87049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:20 compute-0 python3.9[87051]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:58:20 compute-0 sudo[87049]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:21 compute-0 sudo[87202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggvaiqikmlunyrpequzujkdimmggmzlf ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764914300.9557579-278-127433815736526/AnsiballZ_edpm_nftables_from_files.py'
Dec 05 05:58:21 compute-0 sudo[87202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:21 compute-0 python3[87204]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 05 05:58:21 compute-0 sudo[87202]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:21 compute-0 sudo[87354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytpyeunifbtuskcwggybkjipnjlcdcvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914301.5239594-294-154319581120683/AnsiballZ_stat.py'
Dec 05 05:58:21 compute-0 sudo[87354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:21 compute-0 python3.9[87356]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:58:21 compute-0 sudo[87354]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:22 compute-0 sudo[87479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sawsotpphbrqmyicjgwrhuwfiqgbiape ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914301.5239594-294-154319581120683/AnsiballZ_copy.py'
Dec 05 05:58:22 compute-0 sudo[87479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:22 compute-0 python3.9[87481]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914301.5239594-294-154319581120683/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:58:22 compute-0 sudo[87479]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:22 compute-0 sudo[87631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnflkemealcdzxgcgsxtvooeafbytwab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914302.5033262-324-280228881937955/AnsiballZ_stat.py'
Dec 05 05:58:22 compute-0 sudo[87631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:22 compute-0 python3.9[87633]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:58:22 compute-0 sudo[87631]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:23 compute-0 sudo[87756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxbatuahmrdqvkseazqmnunoqliyswsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914302.5033262-324-280228881937955/AnsiballZ_copy.py'
Dec 05 05:58:23 compute-0 sudo[87756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:23 compute-0 python3.9[87758]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914302.5033262-324-280228881937955/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:58:23 compute-0 sudo[87756]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:23 compute-0 sudo[87908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-coxnniwfhpwjxsylvujyvoasyqunivbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914303.365814-354-221776046417799/AnsiballZ_stat.py'
Dec 05 05:58:23 compute-0 sudo[87908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:23 compute-0 python3.9[87910]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:58:23 compute-0 sudo[87908]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:23 compute-0 sudo[88033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgicvyelxkwxsivoldnbbvqipsdbmdsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914303.365814-354-221776046417799/AnsiballZ_copy.py'
Dec 05 05:58:23 compute-0 sudo[88033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:24 compute-0 python3.9[88035]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914303.365814-354-221776046417799/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:58:24 compute-0 sudo[88033]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:24 compute-0 sudo[88185]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-poybzcnfrfehthfhmbmokgwbsplftuap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914304.2376504-384-206546954960086/AnsiballZ_stat.py'
Dec 05 05:58:24 compute-0 sudo[88185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:24 compute-0 python3.9[88187]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:58:24 compute-0 sudo[88185]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:24 compute-0 sudo[88310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-riexzhmqhczmueiewhkxglkmhfcdbikj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914304.2376504-384-206546954960086/AnsiballZ_copy.py'
Dec 05 05:58:24 compute-0 sudo[88310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:24 compute-0 python3.9[88312]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914304.2376504-384-206546954960086/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:58:24 compute-0 sudo[88310]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:25 compute-0 sudo[88462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhhbuxpkfhyqowjepcfpfiokpsdzzalv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914305.0716774-414-114084547720102/AnsiballZ_stat.py'
Dec 05 05:58:25 compute-0 sudo[88462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:25 compute-0 python3.9[88464]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:58:25 compute-0 sudo[88462]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:25 compute-0 sudo[88587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ampxrfkzxsntxfbrkgnhtyiqerqeemuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914305.0716774-414-114084547720102/AnsiballZ_copy.py'
Dec 05 05:58:25 compute-0 sudo[88587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:25 compute-0 python3.9[88589]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914305.0716774-414-114084547720102/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:58:25 compute-0 sudo[88587]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:26 compute-0 sudo[88739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iubsnuxehxpvtocbkmqnqgzrdqjieslx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914305.9450555-444-37217689925709/AnsiballZ_file.py'
Dec 05 05:58:26 compute-0 sudo[88739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:26 compute-0 python3.9[88741]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:58:26 compute-0 sudo[88739]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:26 compute-0 sudo[88891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-latniochwdszmoitpiasjqprkfjpcasx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914306.391452-460-66888163298255/AnsiballZ_command.py'
Dec 05 05:58:26 compute-0 sudo[88891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:26 compute-0 python3.9[88893]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:58:26 compute-0 sudo[88891]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:27 compute-0 sudo[89046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqkudgfxamljjqtiifcjisnsgsrqfwkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914306.8516517-476-112264380052123/AnsiballZ_blockinfile.py'
Dec 05 05:58:27 compute-0 sudo[89046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:27 compute-0 python3.9[89048]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:58:27 compute-0 sudo[89046]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:27 compute-0 sudo[89198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuqkftmqdxyhxnwbjjyqeaevmoxjnmeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914307.446978-494-276768288154618/AnsiballZ_command.py'
Dec 05 05:58:27 compute-0 sudo[89198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:27 compute-0 python3.9[89200]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:58:27 compute-0 sudo[89198]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:28 compute-0 sudo[89351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qexdxnzisfzuytjmjdxkzgussurocxsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914307.898381-510-143168696966372/AnsiballZ_stat.py'
Dec 05 05:58:28 compute-0 sudo[89351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:28 compute-0 python3.9[89353]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 05:58:28 compute-0 sudo[89351]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:28 compute-0 sudo[89505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqjxmwubdxvqdatnqpwknheodftjhqax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914308.33478-526-186160991756942/AnsiballZ_command.py'
Dec 05 05:58:28 compute-0 sudo[89505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:28 compute-0 python3.9[89507]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:58:28 compute-0 sudo[89505]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:28 compute-0 sudo[89660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnwcwbqqyhperiiinyxuwmxkdxfdvwzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914308.7957244-542-140334826334117/AnsiballZ_file.py'
Dec 05 05:58:28 compute-0 sudo[89660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:29 compute-0 python3.9[89662]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:58:29 compute-0 sudo[89660]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:29 compute-0 python3.9[89812]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 05:58:30 compute-0 sudo[89963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkzcjsosuicttnoulwvyyovqezilntec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914310.4008234-622-170494607954126/AnsiballZ_command.py'
Dec 05 05:58:30 compute-0 sudo[89963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:30 compute-0 python3.9[89965]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:1e:0a:f2:93:49:d5" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:58:30 compute-0 ovs-vsctl[89966]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:1e:0a:f2:93:49:d5 external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Dec 05 05:58:30 compute-0 sudo[89963]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:31 compute-0 sudo[90116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oawsftttwnbmtzvhnuhzobmkzswsetnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914310.8786714-640-148966723366916/AnsiballZ_command.py'
Dec 05 05:58:31 compute-0 sudo[90116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:31 compute-0 python3.9[90118]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:58:31 compute-0 sudo[90116]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:31 compute-0 sudo[90271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsjymaxyluqsbsimbjpyahtymeazurhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914311.317801-656-200269394431441/AnsiballZ_command.py'
Dec 05 05:58:31 compute-0 sudo[90271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:31 compute-0 python3.9[90273]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:58:31 compute-0 ovs-vsctl[90274]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Dec 05 05:58:31 compute-0 sudo[90271]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:32 compute-0 python3.9[90424]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 05:58:32 compute-0 sudo[90576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zntzvqvxtvmpvlqelvoulwmkewtayujz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914312.243521-690-266446033762034/AnsiballZ_file.py'
Dec 05 05:58:32 compute-0 sudo[90576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:32 compute-0 python3.9[90578]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:58:32 compute-0 sudo[90576]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:32 compute-0 sudo[90728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmpfxciezqixpcwdkxzchqzxrxklvhht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914312.696395-706-156662889116860/AnsiballZ_stat.py'
Dec 05 05:58:32 compute-0 sudo[90728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:33 compute-0 python3.9[90730]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:58:33 compute-0 sudo[90728]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:33 compute-0 sudo[90806]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-siafmqlulnmoaunczhetrjecntllnhkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914312.696395-706-156662889116860/AnsiballZ_file.py'
Dec 05 05:58:33 compute-0 sudo[90806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:33 compute-0 python3.9[90808]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:58:33 compute-0 sudo[90806]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:33 compute-0 sudo[90958]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-royvdcplvtyktgkfyazcwlxnhheykqls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914313.4230924-706-144827983685997/AnsiballZ_stat.py'
Dec 05 05:58:33 compute-0 sudo[90958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:33 compute-0 python3.9[90960]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:58:33 compute-0 sudo[90958]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:33 compute-0 sudo[91036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjzenrareynjlwonzfqjtmngcgrbxtry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914313.4230924-706-144827983685997/AnsiballZ_file.py'
Dec 05 05:58:33 compute-0 sudo[91036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:34 compute-0 python3.9[91038]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:58:34 compute-0 sudo[91036]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:34 compute-0 sudo[91188]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aejwunngoxddqzwfmcavbvhtzcqpqmdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914314.167326-752-88779806978744/AnsiballZ_file.py'
Dec 05 05:58:34 compute-0 sudo[91188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:34 compute-0 python3.9[91190]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:58:34 compute-0 sudo[91188]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:34 compute-0 sudo[91340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrhqsyvwrhgecmrahjzstrsunilghddq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914314.6223454-768-238135877102153/AnsiballZ_stat.py'
Dec 05 05:58:34 compute-0 sudo[91340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:34 compute-0 python3.9[91342]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:58:34 compute-0 sudo[91340]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:35 compute-0 sudo[91418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jezcqpixattqubtwenfjwsfkszlqtupa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914314.6223454-768-238135877102153/AnsiballZ_file.py'
Dec 05 05:58:35 compute-0 sudo[91418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:35 compute-0 python3.9[91420]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:58:35 compute-0 sudo[91418]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:35 compute-0 chronyd[64778]: Selected source 208.113.130.146 (pool.ntp.org)
Dec 05 05:58:35 compute-0 sudo[91570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sryzdhledcwqyxhomwduhqzuhgnxcqsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914315.3645935-792-254683826258709/AnsiballZ_stat.py'
Dec 05 05:58:35 compute-0 sudo[91570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:35 compute-0 python3.9[91572]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:58:35 compute-0 sudo[91570]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:35 compute-0 sudo[91648]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vruyczvsauonydbneloslbottscfudwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914315.3645935-792-254683826258709/AnsiballZ_file.py'
Dec 05 05:58:35 compute-0 sudo[91648]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:35 compute-0 python3.9[91650]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:58:36 compute-0 sudo[91648]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:36 compute-0 sudo[91800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgvavftrdbbhpegqiscmyfrqqvyvtmdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914316.1110413-816-27634027517768/AnsiballZ_systemd.py'
Dec 05 05:58:36 compute-0 sudo[91800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:36 compute-0 python3.9[91802]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 05:58:36 compute-0 systemd[1]: Reloading.
Dec 05 05:58:36 compute-0 systemd-rc-local-generator[91822]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 05:58:36 compute-0 systemd-sysv-generator[91826]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 05:58:36 compute-0 sudo[91800]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:36 compute-0 sudo[91989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwpbavlrcrbkekbfercnjmxhmnmgoxcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914316.826696-832-128463469764326/AnsiballZ_stat.py'
Dec 05 05:58:36 compute-0 sudo[91989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:37 compute-0 python3.9[91991]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:58:37 compute-0 sudo[91989]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:37 compute-0 sudo[92067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xttxwfolmtroieapxrqfgjoltkzybemd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914316.826696-832-128463469764326/AnsiballZ_file.py'
Dec 05 05:58:37 compute-0 sudo[92067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:37 compute-0 python3.9[92069]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:58:37 compute-0 sudo[92067]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:37 compute-0 sudo[92219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgrekpgmminadkwkkyaqfnklywigkdpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914317.584056-856-124715725088532/AnsiballZ_stat.py'
Dec 05 05:58:37 compute-0 sudo[92219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:37 compute-0 python3.9[92221]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:58:37 compute-0 sudo[92219]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:38 compute-0 sudo[92297]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-boqsqczzxmetpvaewrpunxdhnaxnpzfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914317.584056-856-124715725088532/AnsiballZ_file.py'
Dec 05 05:58:38 compute-0 sudo[92297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:38 compute-0 python3.9[92299]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:58:38 compute-0 sudo[92297]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:38 compute-0 sudo[92449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijlarjayiszijbzpuhzkmloogxuyykjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914318.3423908-880-79477970932447/AnsiballZ_systemd.py'
Dec 05 05:58:38 compute-0 sudo[92449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:38 compute-0 python3.9[92451]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 05:58:38 compute-0 systemd[1]: Reloading.
Dec 05 05:58:38 compute-0 systemd-sysv-generator[92478]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 05:58:38 compute-0 systemd-rc-local-generator[92475]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 05:58:38 compute-0 systemd[1]: Starting Create netns directory...
Dec 05 05:58:38 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 05 05:58:38 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 05 05:58:38 compute-0 systemd[1]: Finished Create netns directory.
Dec 05 05:58:39 compute-0 sudo[92449]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:39 compute-0 sudo[92642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pascrbkzgatazyogskyqrlgajqledaqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914319.2122285-900-180551858873028/AnsiballZ_file.py'
Dec 05 05:58:39 compute-0 sudo[92642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:39 compute-0 python3.9[92644]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:58:39 compute-0 sudo[92642]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:39 compute-0 sudo[92794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wklyndapyjchqorbmeadlmrxxietkraq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914319.667667-916-252074877604477/AnsiballZ_stat.py'
Dec 05 05:58:39 compute-0 sudo[92794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:39 compute-0 python3.9[92796]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:58:39 compute-0 sudo[92794]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:40 compute-0 sudo[92917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irkrlhvizgxuyfmayemnozeuvyrqeyhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914319.667667-916-252074877604477/AnsiballZ_copy.py'
Dec 05 05:58:40 compute-0 sudo[92917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:40 compute-0 python3.9[92919]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764914319.667667-916-252074877604477/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:58:40 compute-0 sudo[92917]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:40 compute-0 sudo[93069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfficwtfjperrgnfptmhpytckeaezdgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914320.6904736-950-222544136234919/AnsiballZ_file.py'
Dec 05 05:58:40 compute-0 sudo[93069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:41 compute-0 python3.9[93071]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:58:41 compute-0 sudo[93069]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:41 compute-0 sudo[93221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjcthzkixfvjvlufwbypydofrmmgqnuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914321.1576357-966-186249830274403/AnsiballZ_stat.py'
Dec 05 05:58:41 compute-0 sudo[93221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:41 compute-0 python3.9[93223]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:58:41 compute-0 sudo[93221]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:41 compute-0 sudo[93344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzlsgumtpmgnqxpuisywifxwdlenpboz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914321.1576357-966-186249830274403/AnsiballZ_copy.py'
Dec 05 05:58:41 compute-0 sudo[93344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:41 compute-0 python3.9[93346]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764914321.1576357-966-186249830274403/.source.json _original_basename=.apz6ev68 follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:58:41 compute-0 sudo[93344]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:42 compute-0 sudo[93496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scgghsyobndqkrmlnsdrxiuzvvkkyqqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914322.0861876-996-174986907553162/AnsiballZ_file.py'
Dec 05 05:58:42 compute-0 sudo[93496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:42 compute-0 python3.9[93498]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:58:42 compute-0 sudo[93496]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:42 compute-0 sudo[93648]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggiiyuksgfqocdhzigfmcvvrbuvpqshp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914322.5683658-1012-259982568252842/AnsiballZ_stat.py'
Dec 05 05:58:42 compute-0 sudo[93648]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:42 compute-0 sudo[93648]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:43 compute-0 sudo[93771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrjtdofpkpvcmcsqldoadukjkdorkvys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914322.5683658-1012-259982568252842/AnsiballZ_copy.py'
Dec 05 05:58:43 compute-0 sudo[93771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:43 compute-0 sudo[93771]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:43 compute-0 sudo[93923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwqcojhcqvuyurkpotyporkkoxspcwot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914323.5221944-1046-183569076237485/AnsiballZ_container_config_data.py'
Dec 05 05:58:43 compute-0 sudo[93923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:43 compute-0 python3.9[93925]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Dec 05 05:58:43 compute-0 sudo[93923]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:44 compute-0 sudo[94075]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwtyzdvlslfzrxxblmeyfipusgkqaxgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914324.1274776-1064-20262299533335/AnsiballZ_container_config_hash.py'
Dec 05 05:58:44 compute-0 sudo[94075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:44 compute-0 python3.9[94077]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 05 05:58:44 compute-0 sudo[94075]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:45 compute-0 sudo[94227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iegjnnmgqejjodsltanebwwapjqevmrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914324.7436762-1082-179123074562757/AnsiballZ_podman_container_info.py'
Dec 05 05:58:45 compute-0 sudo[94227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:45 compute-0 python3.9[94229]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 05 05:58:45 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 05:58:45 compute-0 sudo[94227]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:45 compute-0 sudo[94390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrfqadomjsslgzptwofohziypcwexuxb ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764914325.575892-1108-67598527856679/AnsiballZ_edpm_container_manage.py'
Dec 05 05:58:45 compute-0 sudo[94390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:46 compute-0 python3[94392]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 05 05:58:46 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 05:58:46 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 05:58:46 compute-0 podman[94421]: 2025-12-05 05:58:46.245912454 +0000 UTC m=+0.028925219 container create 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true)
Dec 05 05:58:46 compute-0 podman[94421]: 2025-12-05 05:58:46.23315969 +0000 UTC m=+0.016172475 image pull 8a34d4ae7a6c24e04826a1710ee4298adbc68547aa0db91d73c9de73375782b7 quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current
Dec 05 05:58:46 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 05:58:46 compute-0 python3[94392]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current
Dec 05 05:58:46 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 05 05:58:46 compute-0 sudo[94390]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:46 compute-0 sudo[94599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmmisgwkefwbpnxackrywtlhnenbsbks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914326.4426396-1124-214187057216390/AnsiballZ_stat.py'
Dec 05 05:58:46 compute-0 sudo[94599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:46 compute-0 python3.9[94601]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 05:58:46 compute-0 sudo[94599]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:47 compute-0 sudo[94753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgnjkfylcylopouvpwpreatueyfxnqsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914326.9565876-1142-129754408378559/AnsiballZ_file.py'
Dec 05 05:58:47 compute-0 sudo[94753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:47 compute-0 python3.9[94755]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:58:47 compute-0 sudo[94753]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:47 compute-0 sudo[94829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcnvlvcdliiffkupqnlmkqichunkscyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914326.9565876-1142-129754408378559/AnsiballZ_stat.py'
Dec 05 05:58:47 compute-0 sudo[94829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:47 compute-0 python3.9[94831]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 05:58:47 compute-0 sudo[94829]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:47 compute-0 sudo[94980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahcbnggadgoomrtumjuviftgurcsirzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914327.6251912-1142-160521625514809/AnsiballZ_copy.py'
Dec 05 05:58:47 compute-0 sudo[94980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:48 compute-0 python3.9[94982]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764914327.6251912-1142-160521625514809/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:58:48 compute-0 sudo[94980]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:48 compute-0 sudo[95056]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfcytdllbgeypoevorzgffxtrgahdnpe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914327.6251912-1142-160521625514809/AnsiballZ_systemd.py'
Dec 05 05:58:48 compute-0 sudo[95056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:48 compute-0 python3.9[95058]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 05 05:58:48 compute-0 systemd[1]: Reloading.
Dec 05 05:58:48 compute-0 systemd-rc-local-generator[95080]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 05:58:48 compute-0 systemd-sysv-generator[95083]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 05:58:48 compute-0 sudo[95056]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:48 compute-0 sudo[95168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncispunefeggayvtbbbtojrmelykbyzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914327.6251912-1142-160521625514809/AnsiballZ_systemd.py'
Dec 05 05:58:48 compute-0 sudo[95168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:49 compute-0 python3.9[95170]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 05:58:49 compute-0 systemd[1]: Reloading.
Dec 05 05:58:49 compute-0 systemd-rc-local-generator[95195]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 05:58:49 compute-0 systemd-sysv-generator[95199]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 05:58:49 compute-0 systemd[1]: Starting ovn_controller container...
Dec 05 05:58:49 compute-0 systemd[1]: Created slice Virtual Machine and Container Slice.
Dec 05 05:58:49 compute-0 systemd[1]: Started libcrun container.
Dec 05 05:58:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8b35b5166211f429792d0a326da071388c82dc542f21c0eaf674b45b49c5e236/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec 05 05:58:49 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6.
Dec 05 05:58:49 compute-0 podman[95211]: 2025-12-05 05:58:49.327722636 +0000 UTC m=+0.082947670 container init 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 05 05:58:49 compute-0 ovn_controller[95223]: + sudo -E kolla_set_configs
Dec 05 05:58:49 compute-0 podman[95211]: 2025-12-05 05:58:49.346956635 +0000 UTC m=+0.102181649 container start 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec)
Dec 05 05:58:49 compute-0 edpm-start-podman-container[95211]: ovn_controller
Dec 05 05:58:49 compute-0 systemd[1]: Created slice User Slice of UID 0.
Dec 05 05:58:49 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Dec 05 05:58:49 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Dec 05 05:58:49 compute-0 systemd[1]: Starting User Manager for UID 0...
Dec 05 05:58:49 compute-0 edpm-start-podman-container[95210]: Creating additional drop-in dependency for "ovn_controller" (9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6)
Dec 05 05:58:49 compute-0 systemd[95252]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Dec 05 05:58:49 compute-0 systemd[1]: Reloading.
Dec 05 05:58:49 compute-0 podman[95230]: 2025-12-05 05:58:49.423552673 +0000 UTC m=+0.067997335 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 05 05:58:49 compute-0 systemd[95252]: Queued start job for default target Main User Target.
Dec 05 05:58:49 compute-0 systemd-rc-local-generator[95300]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 05:58:49 compute-0 systemd[95252]: Created slice User Application Slice.
Dec 05 05:58:49 compute-0 systemd[95252]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Dec 05 05:58:49 compute-0 systemd[95252]: Started Daily Cleanup of User's Temporary Directories.
Dec 05 05:58:49 compute-0 systemd[95252]: Reached target Paths.
Dec 05 05:58:49 compute-0 systemd[95252]: Reached target Timers.
Dec 05 05:58:49 compute-0 systemd[95252]: Starting D-Bus User Message Bus Socket...
Dec 05 05:58:49 compute-0 systemd-sysv-generator[95303]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 05:58:49 compute-0 systemd[95252]: Starting Create User's Volatile Files and Directories...
Dec 05 05:58:49 compute-0 systemd[95252]: Finished Create User's Volatile Files and Directories.
Dec 05 05:58:49 compute-0 systemd[95252]: Listening on D-Bus User Message Bus Socket.
Dec 05 05:58:49 compute-0 systemd[95252]: Reached target Sockets.
Dec 05 05:58:49 compute-0 systemd[95252]: Reached target Basic System.
Dec 05 05:58:49 compute-0 systemd[95252]: Reached target Main User Target.
Dec 05 05:58:49 compute-0 systemd[95252]: Startup finished in 81ms.
Dec 05 05:58:49 compute-0 systemd[1]: Started User Manager for UID 0.
Dec 05 05:58:49 compute-0 systemd[1]: Started ovn_controller container.
Dec 05 05:58:49 compute-0 systemd[1]: 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6-40b962e91cf2d75.service: Main process exited, code=exited, status=1/FAILURE
Dec 05 05:58:49 compute-0 systemd[1]: 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6-40b962e91cf2d75.service: Failed with result 'exit-code'.
Dec 05 05:58:49 compute-0 systemd[1]: Started Session c1 of User root.
Dec 05 05:58:49 compute-0 sudo[95168]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:49 compute-0 ovn_controller[95223]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 05 05:58:49 compute-0 ovn_controller[95223]: INFO:__main__:Validating config file
Dec 05 05:58:49 compute-0 ovn_controller[95223]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 05 05:58:49 compute-0 ovn_controller[95223]: INFO:__main__:Writing out command to execute
Dec 05 05:58:49 compute-0 systemd[1]: session-c1.scope: Deactivated successfully.
Dec 05 05:58:49 compute-0 ovn_controller[95223]: ++ cat /run_command
Dec 05 05:58:49 compute-0 ovn_controller[95223]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Dec 05 05:58:49 compute-0 ovn_controller[95223]: + ARGS=
Dec 05 05:58:49 compute-0 ovn_controller[95223]: + sudo kolla_copy_cacerts
Dec 05 05:58:49 compute-0 systemd[1]: Started Session c2 of User root.
Dec 05 05:58:49 compute-0 ovn_controller[95223]: + [[ ! -n '' ]]
Dec 05 05:58:49 compute-0 ovn_controller[95223]: + . kolla_extend_start
Dec 05 05:58:49 compute-0 ovn_controller[95223]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Dec 05 05:58:49 compute-0 ovn_controller[95223]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Dec 05 05:58:49 compute-0 ovn_controller[95223]: + umask 0022
Dec 05 05:58:49 compute-0 ovn_controller[95223]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Dec 05 05:58:49 compute-0 systemd[1]: session-c2.scope: Deactivated successfully.
Dec 05 05:58:49 compute-0 ovn_controller[95223]: 2025-12-05T05:58:49Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec 05 05:58:49 compute-0 ovn_controller[95223]: 2025-12-05T05:58:49Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec 05 05:58:49 compute-0 ovn_controller[95223]: 2025-12-05T05:58:49Z|00003|main|INFO|OVN internal version is : [24.09.4-20.37.0-77.8]
Dec 05 05:58:49 compute-0 ovn_controller[95223]: 2025-12-05T05:58:49Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Dec 05 05:58:49 compute-0 ovn_controller[95223]: 2025-12-05T05:58:49Z|00005|stream_ssl|ERR|ssl:ovsdbserver-sb.openstack.svc:6642: connect: Address family not supported by protocol
Dec 05 05:58:49 compute-0 ovn_controller[95223]: 2025-12-05T05:58:49Z|00006|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 05 05:58:49 compute-0 ovn_controller[95223]: 2025-12-05T05:58:49Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Address family not supported by protocol)
Dec 05 05:58:49 compute-0 ovn_controller[95223]: 2025-12-05T05:58:49Z|00008|main|INFO|OVNSB IDL reconnected, force recompute.
Dec 05 05:58:49 compute-0 ovn_controller[95223]: 2025-12-05T05:58:49Z|00009|ovn_util|INFO|statctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Dec 05 05:58:49 compute-0 ovn_controller[95223]: 2025-12-05T05:58:49Z|00010|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 05 05:58:49 compute-0 ovn_controller[95223]: 2025-12-05T05:58:49Z|00011|rconn|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (No such file or directory)
Dec 05 05:58:49 compute-0 ovn_controller[95223]: 2025-12-05T05:58:49Z|00012|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 1 seconds before reconnect
Dec 05 05:58:49 compute-0 ovn_controller[95223]: 2025-12-05T05:58:49Z|00013|ovn_util|INFO|pinctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Dec 05 05:58:49 compute-0 ovn_controller[95223]: 2025-12-05T05:58:49Z|00014|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 05 05:58:49 compute-0 ovn_controller[95223]: 2025-12-05T05:58:49Z|00015|rconn|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (No such file or directory)
Dec 05 05:58:49 compute-0 ovn_controller[95223]: 2025-12-05T05:58:49Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 1 seconds before reconnect
Dec 05 05:58:49 compute-0 NetworkManager[55434]: <info>  [1764914329.6934] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Dec 05 05:58:49 compute-0 NetworkManager[55434]: <info>  [1764914329.6938] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 05 05:58:49 compute-0 NetworkManager[55434]: <info>  [1764914329.6944] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Dec 05 05:58:49 compute-0 NetworkManager[55434]: <info>  [1764914329.6947] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Dec 05 05:58:49 compute-0 NetworkManager[55434]: <info>  [1764914329.6949] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Dec 05 05:58:49 compute-0 kernel: br-int: entered promiscuous mode
Dec 05 05:58:49 compute-0 systemd-udevd[95359]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 05:58:49 compute-0 sudo[95481]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcuscqmvxampgouxvfhkhhhiihnxkyhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914329.7354057-1198-46196341275630/AnsiballZ_command.py'
Dec 05 05:58:49 compute-0 sudo[95481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:50 compute-0 python3.9[95483]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:58:50 compute-0 ovs-vsctl[95484]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Dec 05 05:58:50 compute-0 sudo[95481]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:50 compute-0 sudo[95634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnicqpaycfeicimyzyysykcwnvltihri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914330.197449-1214-69237274104122/AnsiballZ_command.py'
Dec 05 05:58:50 compute-0 sudo[95634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:50 compute-0 python3.9[95636]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:58:50 compute-0 ovs-vsctl[95638]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Dec 05 05:58:50 compute-0 sudo[95634]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:50 compute-0 ovn_controller[95223]: 2025-12-05T05:58:50Z|00001|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 05 05:58:50 compute-0 ovn_controller[95223]: 2025-12-05T05:58:50Z|00001|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 05 05:58:50 compute-0 ovn_controller[95223]: 2025-12-05T05:58:50Z|00017|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 05 05:58:50 compute-0 ovn_controller[95223]: 2025-12-05T05:58:50Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 05 05:58:50 compute-0 ovn_controller[95223]: 2025-12-05T05:58:50Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 05 05:58:50 compute-0 ovn_controller[95223]: 2025-12-05T05:58:50Z|00018|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Dec 05 05:58:50 compute-0 ovn_controller[95223]: 2025-12-05T05:58:50Z|00019|ovn_util|INFO|features: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Dec 05 05:58:50 compute-0 ovn_controller[95223]: 2025-12-05T05:58:50Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 05 05:58:50 compute-0 ovn_controller[95223]: 2025-12-05T05:58:50Z|00021|features|INFO|OVS Feature: ct_zero_snat, state: supported
Dec 05 05:58:50 compute-0 ovn_controller[95223]: 2025-12-05T05:58:50Z|00022|features|INFO|OVS Feature: ct_flush, state: supported
Dec 05 05:58:50 compute-0 ovn_controller[95223]: 2025-12-05T05:58:50Z|00023|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Dec 05 05:58:50 compute-0 ovn_controller[95223]: 2025-12-05T05:58:50Z|00024|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec 05 05:58:50 compute-0 ovn_controller[95223]: 2025-12-05T05:58:50Z|00025|main|INFO|OVS feature set changed, force recompute.
Dec 05 05:58:50 compute-0 ovn_controller[95223]: 2025-12-05T05:58:50Z|00026|ovn_util|INFO|ofctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Dec 05 05:58:50 compute-0 ovn_controller[95223]: 2025-12-05T05:58:50Z|00027|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 05 05:58:50 compute-0 ovn_controller[95223]: 2025-12-05T05:58:50Z|00028|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 05 05:58:50 compute-0 ovn_controller[95223]: 2025-12-05T05:58:50Z|00029|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Dec 05 05:58:50 compute-0 ovn_controller[95223]: 2025-12-05T05:58:50Z|00030|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Dec 05 05:58:50 compute-0 ovn_controller[95223]: 2025-12-05T05:58:50Z|00031|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 05 05:58:50 compute-0 ovn_controller[95223]: 2025-12-05T05:58:50Z|00032|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec 05 05:58:50 compute-0 ovn_controller[95223]: 2025-12-05T05:58:50Z|00033|features|INFO|OVS Feature: meter_support, state: supported
Dec 05 05:58:50 compute-0 ovn_controller[95223]: 2025-12-05T05:58:50Z|00034|features|INFO|OVS Feature: group_support, state: supported
Dec 05 05:58:50 compute-0 ovn_controller[95223]: 2025-12-05T05:58:50Z|00035|main|INFO|OVS feature set changed, force recompute.
Dec 05 05:58:50 compute-0 ovn_controller[95223]: 2025-12-05T05:58:50Z|00036|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Dec 05 05:58:50 compute-0 ovn_controller[95223]: 2025-12-05T05:58:50Z|00037|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Dec 05 05:58:50 compute-0 NetworkManager[55434]: <info>  [1764914330.7079] manager: (ovn-ac6eca-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Dec 05 05:58:50 compute-0 systemd-udevd[95377]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 05:58:50 compute-0 kernel: genev_sys_6081: entered promiscuous mode
Dec 05 05:58:50 compute-0 NetworkManager[55434]: <info>  [1764914330.7223] device (genev_sys_6081): carrier: link connected
Dec 05 05:58:50 compute-0 NetworkManager[55434]: <info>  [1764914330.7225] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Dec 05 05:58:50 compute-0 NetworkManager[55434]: <info>  [1764914330.7344] manager: (ovn-780dfa-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Dec 05 05:58:51 compute-0 sudo[95792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnxdhlaillminqhnehzcppfvbpaunmqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914330.8967793-1242-272604949069253/AnsiballZ_command.py'
Dec 05 05:58:51 compute-0 sudo[95792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:51 compute-0 python3.9[95794]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:58:51 compute-0 ovs-vsctl[95795]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Dec 05 05:58:51 compute-0 sudo[95792]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:51 compute-0 sshd-session[84749]: Connection closed by 192.168.122.30 port 55230
Dec 05 05:58:51 compute-0 sshd-session[84746]: pam_unix(sshd:session): session closed for user zuul
Dec 05 05:58:51 compute-0 systemd[1]: session-19.scope: Deactivated successfully.
Dec 05 05:58:51 compute-0 systemd[1]: session-19.scope: Consumed 30.602s CPU time.
Dec 05 05:58:51 compute-0 systemd-logind[745]: Session 19 logged out. Waiting for processes to exit.
Dec 05 05:58:51 compute-0 systemd-logind[745]: Removed session 19.
Dec 05 05:58:56 compute-0 sshd-session[95820]: Accepted publickey for zuul from 192.168.122.30 port 49074 ssh2: ECDSA SHA256:qs1gnk6DlsWBMwOXH28ouwL9ltr8rmS6SqK96Fn6Lw8
Dec 05 05:58:56 compute-0 systemd-logind[745]: New session 21 of user zuul.
Dec 05 05:58:56 compute-0 systemd[1]: Started Session 21 of User zuul.
Dec 05 05:58:56 compute-0 sshd-session[95820]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 05:58:56 compute-0 python3.9[95973]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 05:58:57 compute-0 sudo[96127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvrxjfbkbhevwfwilseslrmugtixieqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914337.228706-48-74463425445763/AnsiballZ_file.py'
Dec 05 05:58:57 compute-0 sudo[96127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:57 compute-0 python3.9[96129]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:58:57 compute-0 sudo[96127]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:57 compute-0 sudo[96279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlmkmomfrrphvyagmxznwyjahhcbdyat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914337.802799-48-265751684600877/AnsiballZ_file.py'
Dec 05 05:58:57 compute-0 sudo[96279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:58 compute-0 python3.9[96281]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:58:58 compute-0 sudo[96279]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:58 compute-0 sudo[96431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdumvpmruoiyjxfaqkyqmtfobpuburqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914338.2458913-48-229798824199788/AnsiballZ_file.py'
Dec 05 05:58:58 compute-0 sudo[96431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:58 compute-0 python3.9[96433]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:58:58 compute-0 sudo[96431]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:58 compute-0 sudo[96583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwavajhbahitnfwgidakmbnbgcrdaggq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914338.6814737-48-144453061218745/AnsiballZ_file.py'
Dec 05 05:58:58 compute-0 sudo[96583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:59 compute-0 python3.9[96585]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:58:59 compute-0 sudo[96583]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:59 compute-0 sudo[96735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itinqnwzppicuxuxlkuxwuapzsfmrubw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914339.1357405-48-251602021903846/AnsiballZ_file.py'
Dec 05 05:58:59 compute-0 sudo[96735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:58:59 compute-0 python3.9[96737]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:58:59 compute-0 sudo[96735]: pam_unix(sudo:session): session closed for user root
Dec 05 05:58:59 compute-0 systemd[1]: Stopping User Manager for UID 0...
Dec 05 05:58:59 compute-0 systemd[95252]: Activating special unit Exit the Session...
Dec 05 05:58:59 compute-0 systemd[95252]: Stopped target Main User Target.
Dec 05 05:58:59 compute-0 systemd[95252]: Stopped target Basic System.
Dec 05 05:58:59 compute-0 systemd[95252]: Stopped target Paths.
Dec 05 05:58:59 compute-0 systemd[95252]: Stopped target Sockets.
Dec 05 05:58:59 compute-0 systemd[95252]: Stopped target Timers.
Dec 05 05:58:59 compute-0 systemd[95252]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 05 05:58:59 compute-0 systemd[95252]: Closed D-Bus User Message Bus Socket.
Dec 05 05:58:59 compute-0 systemd[95252]: Stopped Create User's Volatile Files and Directories.
Dec 05 05:58:59 compute-0 systemd[95252]: Removed slice User Application Slice.
Dec 05 05:58:59 compute-0 systemd[95252]: Reached target Shutdown.
Dec 05 05:58:59 compute-0 systemd[95252]: Finished Exit the Session.
Dec 05 05:58:59 compute-0 systemd[95252]: Reached target Exit the Session.
Dec 05 05:58:59 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Dec 05 05:58:59 compute-0 systemd[1]: Stopped User Manager for UID 0.
Dec 05 05:58:59 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Dec 05 05:58:59 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Dec 05 05:58:59 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Dec 05 05:58:59 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Dec 05 05:58:59 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Dec 05 05:59:00 compute-0 python3.9[96888]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 05:59:00 compute-0 sudo[97038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvdynbctjlonlaagchxuodwhzjnodbzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914340.1554534-136-117691599640770/AnsiballZ_seboolean.py'
Dec 05 05:59:00 compute-0 sudo[97038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:00 compute-0 python3.9[97040]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec 05 05:59:01 compute-0 sudo[97038]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:01 compute-0 python3.9[97190]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:59:02 compute-0 python3.9[97311]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764914341.2507615-152-15875801042622/.source follow=False _original_basename=haproxy.j2 checksum=9b8332c0ab50981c23ff3c1c8d4c4c389ef4b0e5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:59:02 compute-0 python3.9[97461]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:59:02 compute-0 python3.9[97582]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764914342.3008385-182-115756305802293/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:59:03 compute-0 sudo[97732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sszbzitkhhdsbuzjxneittbeucbakkel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914343.213912-216-130074926821444/AnsiballZ_setup.py'
Dec 05 05:59:03 compute-0 sudo[97732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:03 compute-0 python3.9[97734]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 05 05:59:03 compute-0 sudo[97732]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:04 compute-0 sudo[97816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ueciufkxqikragzfwdynagzcktdvjbdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914343.213912-216-130074926821444/AnsiballZ_dnf.py'
Dec 05 05:59:04 compute-0 sudo[97816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:04 compute-0 python3.9[97818]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 05:59:05 compute-0 sudo[97816]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:05 compute-0 sudo[97969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxntsmwxucvagzhnwvwlwwdbmzdlwgnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914345.3976874-240-204736318181779/AnsiballZ_systemd.py'
Dec 05 05:59:05 compute-0 sudo[97969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:06 compute-0 python3.9[97971]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 05 05:59:06 compute-0 sudo[97969]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:06 compute-0 python3.9[98124]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:59:07 compute-0 python3.9[98245]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764914346.291357-256-275963735191067/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:59:07 compute-0 python3.9[98395]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:59:07 compute-0 python3.9[98516]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764914347.167661-256-58200204119284/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:59:08 compute-0 python3.9[98666]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:59:09 compute-0 python3.9[98788]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764914348.466044-344-207846813846658/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:59:09 compute-0 python3.9[98938]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:59:10 compute-0 python3.9[99059]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764914349.3523808-344-232347112175531/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:59:10 compute-0 python3.9[99209]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 05:59:10 compute-0 sudo[99361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aktmnoqqueicaesuuhturjracxqgclon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914350.7033157-420-199672802672125/AnsiballZ_file.py'
Dec 05 05:59:10 compute-0 sudo[99361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:11 compute-0 python3.9[99363]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:59:11 compute-0 sudo[99361]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:11 compute-0 sudo[99513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjhcshezqgiohdgqfhbmydqjktgnjkiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914351.1661937-436-255238677969016/AnsiballZ_stat.py'
Dec 05 05:59:11 compute-0 sudo[99513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:11 compute-0 python3.9[99515]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:59:11 compute-0 sudo[99513]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:11 compute-0 sudo[99591]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozgrmvxiabpjncqwooogksetkcjjcmmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914351.1661937-436-255238677969016/AnsiballZ_file.py'
Dec 05 05:59:11 compute-0 sudo[99591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:11 compute-0 python3.9[99593]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:59:11 compute-0 sudo[99591]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:12 compute-0 sudo[99743]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aircpjpizdpqvsjlzoxhkcxgcfvxlnmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914351.9128456-436-45699119766251/AnsiballZ_stat.py'
Dec 05 05:59:12 compute-0 sudo[99743]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:12 compute-0 python3.9[99745]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:59:12 compute-0 sudo[99743]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:12 compute-0 sudo[99821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zagvoxhnncidzsxdeiohlbxkevljtigy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914351.9128456-436-45699119766251/AnsiballZ_file.py'
Dec 05 05:59:12 compute-0 sudo[99821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:12 compute-0 python3.9[99823]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:59:12 compute-0 sudo[99821]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:12 compute-0 sudo[99973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-teclwmdeooaobwocjarhymgewbnbhqdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914352.6655204-482-183900382730156/AnsiballZ_file.py'
Dec 05 05:59:12 compute-0 sudo[99973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:12 compute-0 python3.9[99975]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:59:13 compute-0 sudo[99973]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:13 compute-0 sudo[100125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsgepiclhrpkqnxkcbjaepisxzhjimgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914353.1108775-498-35991060612683/AnsiballZ_stat.py'
Dec 05 05:59:13 compute-0 sudo[100125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:13 compute-0 python3.9[100127]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:59:13 compute-0 sudo[100125]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:13 compute-0 sudo[100203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlifgnwjlgirvvkxyaqgnvgekziecmyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914353.1108775-498-35991060612683/AnsiballZ_file.py'
Dec 05 05:59:13 compute-0 sudo[100203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:13 compute-0 python3.9[100205]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:59:13 compute-0 sudo[100203]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:14 compute-0 sudo[100355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlaelyhdzycetchnifmtvugvzrxwwgyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914353.874174-522-213784400084941/AnsiballZ_stat.py'
Dec 05 05:59:14 compute-0 sudo[100355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:14 compute-0 python3.9[100357]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:59:14 compute-0 sudo[100355]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:14 compute-0 sudo[100433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhcglhfuqjyttmcrwyberswxrfhnbuqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914353.874174-522-213784400084941/AnsiballZ_file.py'
Dec 05 05:59:14 compute-0 sudo[100433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:14 compute-0 python3.9[100435]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:59:14 compute-0 sudo[100433]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:14 compute-0 sudo[100585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubvddxupxfhetykcpmphjmxcjiedfynq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914354.632948-546-42758048927803/AnsiballZ_systemd.py'
Dec 05 05:59:14 compute-0 sudo[100585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:15 compute-0 python3.9[100587]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 05:59:15 compute-0 systemd[1]: Reloading.
Dec 05 05:59:15 compute-0 systemd-rc-local-generator[100607]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 05:59:15 compute-0 systemd-sysv-generator[100611]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 05:59:15 compute-0 sudo[100585]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:15 compute-0 sudo[100774]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzuwclkitfgvybfkiwxjzfamuhobnhlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914355.3760395-562-248686766676652/AnsiballZ_stat.py'
Dec 05 05:59:15 compute-0 sudo[100774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:15 compute-0 python3.9[100776]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:59:15 compute-0 sudo[100774]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:15 compute-0 sudo[100852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkpvdcnzvjfolkycyalafceaycysjfpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914355.3760395-562-248686766676652/AnsiballZ_file.py'
Dec 05 05:59:15 compute-0 sudo[100852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:16 compute-0 python3.9[100854]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:59:16 compute-0 sudo[100852]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:16 compute-0 sudo[101004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkfusrrslcvvbymaowaczcahxaukjlur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914356.1425242-586-143894717931139/AnsiballZ_stat.py'
Dec 05 05:59:16 compute-0 sudo[101004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:16 compute-0 python3.9[101006]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:59:16 compute-0 sudo[101004]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:16 compute-0 sudo[101082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eeohfmbifmssjbajgyzwgtpynmfmxzfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914356.1425242-586-143894717931139/AnsiballZ_file.py'
Dec 05 05:59:16 compute-0 sudo[101082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:16 compute-0 python3.9[101084]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:59:16 compute-0 sudo[101082]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:17 compute-0 sudo[101234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sptohynqxxrzvogxhmxkzyszoemkjkbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914356.9187415-610-60260596115015/AnsiballZ_systemd.py'
Dec 05 05:59:17 compute-0 sudo[101234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:17 compute-0 python3.9[101236]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 05:59:17 compute-0 systemd[1]: Reloading.
Dec 05 05:59:17 compute-0 systemd-rc-local-generator[101257]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 05:59:17 compute-0 systemd-sysv-generator[101260]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 05:59:17 compute-0 systemd[1]: Starting Create netns directory...
Dec 05 05:59:17 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 05 05:59:17 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 05 05:59:17 compute-0 systemd[1]: Finished Create netns directory.
Dec 05 05:59:17 compute-0 sudo[101234]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:17 compute-0 sudo[101427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eumymphkrufgwrbbumdayqpbkgglhlnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914357.773748-630-47092713740174/AnsiballZ_file.py'
Dec 05 05:59:17 compute-0 sudo[101427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:18 compute-0 python3.9[101429]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:59:18 compute-0 sudo[101427]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:18 compute-0 sudo[101579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-koqliglvrdufwdpoidzhaliklrqbyvya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914358.2080379-646-164504105473190/AnsiballZ_stat.py'
Dec 05 05:59:18 compute-0 sudo[101579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:18 compute-0 python3.9[101581]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:59:18 compute-0 sudo[101579]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:18 compute-0 sudo[101702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inqsztcsqwmyfvzmkduhxtnsnuetxwbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914358.2080379-646-164504105473190/AnsiballZ_copy.py'
Dec 05 05:59:18 compute-0 sudo[101702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:18 compute-0 python3.9[101704]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764914358.2080379-646-164504105473190/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:59:18 compute-0 sudo[101702]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:19 compute-0 sudo[101854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eybgawdnbsamsyjwddfeiuzxvapzyoax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914359.1548603-680-164833401106458/AnsiballZ_file.py'
Dec 05 05:59:19 compute-0 sudo[101854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:19 compute-0 python3.9[101856]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 05:59:19 compute-0 sudo[101854]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:19 compute-0 sudo[102015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wactqreevsqpyjwcpchwvtvqpkjiwpnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914359.6217937-696-225417922035816/AnsiballZ_stat.py'
Dec 05 05:59:19 compute-0 sudo[102015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:19 compute-0 ovn_controller[95223]: 2025-12-05T05:59:19Z|00038|memory|INFO|16128 kB peak resident set size after 30.1 seconds
Dec 05 05:59:19 compute-0 ovn_controller[95223]: 2025-12-05T05:59:19Z|00039|memory|INFO|idl-cells-OVN_Southbound:256 idl-cells-Open_vSwitch:528 ofctrl_desired_flow_usage-KB:6 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:2
Dec 05 05:59:19 compute-0 podman[101980]: 2025-12-05 05:59:19.847454224 +0000 UTC m=+0.086451086 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 05 05:59:19 compute-0 python3.9[102024]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 05:59:19 compute-0 sudo[102015]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:20 compute-0 sudo[102153]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqambiuegkjyeaqjfwbseipbyscdbfrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914359.6217937-696-225417922035816/AnsiballZ_copy.py'
Dec 05 05:59:20 compute-0 sudo[102153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:20 compute-0 python3.9[102155]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764914359.6217937-696-225417922035816/.source.json _original_basename=.52iz_75j follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:59:20 compute-0 sudo[102153]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:20 compute-0 sudo[102305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsapolpqreaqoxfgxtocywmmozcfvoxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914360.4461644-726-91384334146711/AnsiballZ_file.py'
Dec 05 05:59:20 compute-0 sudo[102305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:20 compute-0 python3.9[102307]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:59:20 compute-0 sudo[102305]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:21 compute-0 sudo[102457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akqxmmnxuutciqthfjlwlnxnuuekmkou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914360.9093704-742-11730968420426/AnsiballZ_stat.py'
Dec 05 05:59:21 compute-0 sudo[102457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:21 compute-0 sudo[102457]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:21 compute-0 sudo[102580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qghbxkuhxwdhelseelxvvrlhfhiduxne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914360.9093704-742-11730968420426/AnsiballZ_copy.py'
Dec 05 05:59:21 compute-0 sudo[102580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:21 compute-0 sudo[102580]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:22 compute-0 sudo[102732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsnfkaxibvdiqufpqsbmihaucoteddom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914361.8061981-776-231725696966013/AnsiballZ_container_config_data.py'
Dec 05 05:59:22 compute-0 sudo[102732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:22 compute-0 python3.9[102734]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Dec 05 05:59:22 compute-0 sudo[102732]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:22 compute-0 sudo[102884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unxzuyflzcsojfjpxpjurpkeoolrfbcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914362.397431-794-116475902324905/AnsiballZ_container_config_hash.py'
Dec 05 05:59:22 compute-0 sudo[102884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:22 compute-0 python3.9[102886]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 05 05:59:22 compute-0 sudo[102884]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:23 compute-0 sudo[103036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-feqgwnmcdazzrszwlvjogedvwraisgkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914363.0111122-812-162836408939336/AnsiballZ_podman_container_info.py'
Dec 05 05:59:23 compute-0 sudo[103036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:23 compute-0 python3.9[103038]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 05 05:59:23 compute-0 sudo[103036]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:24 compute-0 sudo[103207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcivoxuigievwvedvyekexxrnddpclcm ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764914363.893741-838-201868023319027/AnsiballZ_edpm_container_manage.py'
Dec 05 05:59:24 compute-0 sudo[103207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:24 compute-0 python3[103209]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 05 05:59:24 compute-0 podman[103238]: 2025-12-05 05:59:24.560815357 +0000 UTC m=+0.029663567 container create 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202)
Dec 05 05:59:24 compute-0 podman[103238]: 2025-12-05 05:59:24.545669213 +0000 UTC m=+0.014517444 image pull 773fbabd60bc1c8e2ad203301a187a593937fa42bcc751aba0971bd96baac0cb quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current
Dec 05 05:59:24 compute-0 python3[103209]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current
Dec 05 05:59:24 compute-0 sudo[103207]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:24 compute-0 sudo[103414]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnkroaxiufsnjmursipgfioiojtssewt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914364.7635722-854-69313214289589/AnsiballZ_stat.py'
Dec 05 05:59:24 compute-0 sudo[103414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:25 compute-0 python3.9[103416]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 05:59:25 compute-0 sudo[103414]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:25 compute-0 sudo[103568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctujvikbqxduhgpbjyzlksjboutwovqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914365.276031-872-36679145854814/AnsiballZ_file.py'
Dec 05 05:59:25 compute-0 sudo[103568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:25 compute-0 python3.9[103570]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:59:25 compute-0 sudo[103568]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:25 compute-0 sudo[103644]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-morupzqwinteqqerbhyrdoeyiyjytrzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914365.276031-872-36679145854814/AnsiballZ_stat.py'
Dec 05 05:59:25 compute-0 sudo[103644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:25 compute-0 python3.9[103646]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 05:59:25 compute-0 sudo[103644]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:26 compute-0 sudo[103795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btcimircvijmjrpovpefriazzxcshfnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914365.9572847-872-226329797045200/AnsiballZ_copy.py'
Dec 05 05:59:26 compute-0 sudo[103795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:26 compute-0 python3.9[103797]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764914365.9572847-872-226329797045200/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:59:26 compute-0 sudo[103795]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:26 compute-0 sudo[103871]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spvpyuxaeoihfmxilgzzsxudozwfedlp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914365.9572847-872-226329797045200/AnsiballZ_systemd.py'
Dec 05 05:59:26 compute-0 sudo[103871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:26 compute-0 python3.9[103873]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 05 05:59:26 compute-0 systemd[1]: Reloading.
Dec 05 05:59:26 compute-0 systemd-rc-local-generator[103898]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 05:59:26 compute-0 systemd-sysv-generator[103902]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 05:59:27 compute-0 sudo[103871]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:27 compute-0 sudo[103981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxeetxhvkxqgotvnomzmjcrcqxtzoqml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914365.9572847-872-226329797045200/AnsiballZ_systemd.py'
Dec 05 05:59:27 compute-0 sudo[103981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:27 compute-0 python3.9[103983]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 05:59:27 compute-0 systemd[1]: Reloading.
Dec 05 05:59:27 compute-0 systemd-sysv-generator[104009]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 05:59:27 compute-0 systemd-rc-local-generator[104006]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 05:59:27 compute-0 systemd[1]: Starting ovn_metadata_agent container...
Dec 05 05:59:27 compute-0 systemd[1]: Started libcrun container.
Dec 05 05:59:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dc70be0dee51222d91eac084ff3effb186244af2ec479e999fa6e836c47a3ce5/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Dec 05 05:59:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dc70be0dee51222d91eac084ff3effb186244af2ec479e999fa6e836c47a3ce5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 05:59:27 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22.
Dec 05 05:59:27 compute-0 podman[104024]: 2025-12-05 05:59:27.760988598 +0000 UTC m=+0.078448292 container init 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent)
Dec 05 05:59:27 compute-0 ovn_metadata_agent[104036]: + sudo -E kolla_set_configs
Dec 05 05:59:27 compute-0 edpm-start-podman-container[104024]: ovn_metadata_agent
Dec 05 05:59:27 compute-0 podman[104024]: 2025-12-05 05:59:27.780887888 +0000 UTC m=+0.098347561 container start 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Dec 05 05:59:27 compute-0 edpm-start-podman-container[104023]: Creating additional drop-in dependency for "ovn_metadata_agent" (09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22)
Dec 05 05:59:27 compute-0 ovn_metadata_agent[104036]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 05 05:59:27 compute-0 ovn_metadata_agent[104036]: INFO:__main__:Validating config file
Dec 05 05:59:27 compute-0 ovn_metadata_agent[104036]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 05 05:59:27 compute-0 ovn_metadata_agent[104036]: INFO:__main__:Copying service configuration files
Dec 05 05:59:27 compute-0 ovn_metadata_agent[104036]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Dec 05 05:59:27 compute-0 ovn_metadata_agent[104036]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Dec 05 05:59:27 compute-0 ovn_metadata_agent[104036]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Dec 05 05:59:27 compute-0 ovn_metadata_agent[104036]: INFO:__main__:Writing out command to execute
Dec 05 05:59:27 compute-0 ovn_metadata_agent[104036]: INFO:__main__:Setting permission for /var/lib/neutron
Dec 05 05:59:27 compute-0 ovn_metadata_agent[104036]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Dec 05 05:59:27 compute-0 ovn_metadata_agent[104036]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Dec 05 05:59:27 compute-0 ovn_metadata_agent[104036]: INFO:__main__:Setting permission for /var/lib/neutron/external
Dec 05 05:59:27 compute-0 ovn_metadata_agent[104036]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Dec 05 05:59:27 compute-0 ovn_metadata_agent[104036]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Dec 05 05:59:27 compute-0 ovn_metadata_agent[104036]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Dec 05 05:59:27 compute-0 ovn_metadata_agent[104036]: ++ cat /run_command
Dec 05 05:59:27 compute-0 ovn_metadata_agent[104036]: + CMD=neutron-ovn-metadata-agent
Dec 05 05:59:27 compute-0 ovn_metadata_agent[104036]: + ARGS=
Dec 05 05:59:27 compute-0 ovn_metadata_agent[104036]: + sudo kolla_copy_cacerts
Dec 05 05:59:27 compute-0 systemd[1]: Reloading.
Dec 05 05:59:27 compute-0 ovn_metadata_agent[104036]: + [[ ! -n '' ]]
Dec 05 05:59:27 compute-0 ovn_metadata_agent[104036]: + . kolla_extend_start
Dec 05 05:59:27 compute-0 ovn_metadata_agent[104036]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Dec 05 05:59:27 compute-0 ovn_metadata_agent[104036]: Running command: 'neutron-ovn-metadata-agent'
Dec 05 05:59:27 compute-0 ovn_metadata_agent[104036]: + umask 0022
Dec 05 05:59:27 compute-0 ovn_metadata_agent[104036]: + exec neutron-ovn-metadata-agent
Dec 05 05:59:27 compute-0 podman[104043]: 2025-12-05 05:59:27.877317521 +0000 UTC m=+0.084354704 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_managed=true)
Dec 05 05:59:27 compute-0 systemd-rc-local-generator[104104]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 05:59:27 compute-0 systemd-sysv-generator[104109]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 05:59:28 compute-0 systemd[1]: Started ovn_metadata_agent container.
Dec 05 05:59:28 compute-0 sudo[103981]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:28 compute-0 sshd-session[95823]: Connection closed by 192.168.122.30 port 49074
Dec 05 05:59:28 compute-0 sshd-session[95820]: pam_unix(sshd:session): session closed for user zuul
Dec 05 05:59:28 compute-0 systemd-logind[745]: Session 21 logged out. Waiting for processes to exit.
Dec 05 05:59:28 compute-0 systemd[1]: session-21.scope: Deactivated successfully.
Dec 05 05:59:28 compute-0 systemd[1]: session-21.scope: Consumed 24.196s CPU time.
Dec 05 05:59:28 compute-0 systemd-logind[745]: Removed session 21.
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.451 104041 INFO neutron.common.config [-] Logging enabled!
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.451 104041 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 26.1.0.dev268
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.451 104041 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.12/site-packages/neutron/common/config.py:124
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.452 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.452 104041 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.452 104041 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.452 104041 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.452 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.452 104041 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.452 104041 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.452 104041 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.452 104041 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.452 104041 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.453 104041 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.453 104041 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.453 104041 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.453 104041 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.453 104041 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.453 104041 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.453 104041 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.453 104041 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.453 104041 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.453 104041 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.453 104041 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.453 104041 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.453 104041 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.453 104041 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.454 104041 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.454 104041 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.454 104041 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.454 104041 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.454 104041 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.454 104041 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.454 104041 DEBUG neutron.agent.ovn.metadata_agent [-] enable_signals                 = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.454 104041 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.454 104041 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.454 104041 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.454 104041 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.454 104041 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.454 104041 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.455 104041 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.455 104041 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.455 104041 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.455 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.455 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.455 104041 DEBUG neutron.agent.ovn.metadata_agent [-] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.455 104041 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.455 104041 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.455 104041 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.455 104041 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.455 104041 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.455 104041 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.455 104041 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.455 104041 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.456 104041 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.456 104041 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.456 104041 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.456 104041 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.456 104041 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.456 104041 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.456 104041 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.456 104041 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.456 104041 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.456 104041 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.456 104041 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.456 104041 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.456 104041 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.456 104041 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.456 104041 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.457 104041 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.457 104041 DEBUG neutron.agent.ovn.metadata_agent [-] my_ip                          = 192.168.25.227 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.457 104041 DEBUG neutron.agent.ovn.metadata_agent [-] my_ipv6                        = ::1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.457 104041 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.457 104041 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.457 104041 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.457 104041 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.457 104041 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.457 104041 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.457 104041 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.457 104041 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.457 104041 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.457 104041 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.458 104041 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.458 104041 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.458 104041 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.458 104041 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.458 104041 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.458 104041 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.458 104041 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.458 104041 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.458 104041 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.458 104041 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.458 104041 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.458 104041 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.458 104041 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.458 104041 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.458 104041 DEBUG neutron.agent.ovn.metadata_agent [-] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.459 104041 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.459 104041 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.459 104041 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.459 104041 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.459 104041 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.459 104041 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.459 104041 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.459 104041 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.459 104041 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.459 104041 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_qinq                      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.459 104041 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.459 104041 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.459 104041 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.459 104041 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.459 104041 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.460 104041 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.460 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.460 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.460 104041 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.460 104041 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.460 104041 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.460 104041 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.460 104041 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.460 104041 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.460 104041 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.460 104041 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.460 104041 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.460 104041 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_requests        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.460 104041 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.461 104041 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_jaeger.process_tags   = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.461 104041 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_jaeger.service_name_prefix = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.461 104041 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_otlp.service_name_prefix = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.461 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.461 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.461 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.461 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.461 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.461 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.461 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.461 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.461 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.461 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.461 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.462 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.462 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.462 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.462 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.462 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_timeout     = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.462 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.462 104041 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.462 104041 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.462 104041 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.462 104041 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.462 104041 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.log_daemon_traceback   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.462 104041 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.462 104041 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 4 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.462 104041 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.463 104041 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.463 104041 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.463 104041 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.463 104041 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.463 104041 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.463 104041 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 4 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.463 104041 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.463 104041 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.463 104041 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.463 104041 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.463 104041 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.463 104041 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.463 104041 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 4 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.463 104041 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.463 104041 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.464 104041 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.464 104041 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.464 104041 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.464 104041 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.464 104041 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 4 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.464 104041 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.464 104041 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.464 104041 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.464 104041 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.464 104041 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.464 104041 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.464 104041 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 4 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.464 104041 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.464 104041 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.464 104041 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.465 104041 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.465 104041 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.465 104041 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.465 104041 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 4 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.465 104041 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.465 104041 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.465 104041 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.465 104041 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.465 104041 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.465 104041 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.465 104041 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.465 104041 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.465 104041 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.465 104041 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.465 104041 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.466 104041 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mappings            = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.466 104041 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.datapath_type              = system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.466 104041 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.466 104041 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood_reports         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.466 104041 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood_unregistered    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.466 104041 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.466 104041 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.int_peer_patch_port        = patch-tun log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.466 104041 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.integration_bridge         = br-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.466 104041 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.local_ip                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.466 104041 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_connect_timeout         = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.466 104041 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_inactivity_probe        = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.466 104041 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_listen_address          = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.466 104041 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_listen_port             = 6633 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.466 104041 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_request_timeout         = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.466 104041 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.openflow_processed_per_port = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.467 104041 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.467 104041 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_debug                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.467 104041 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.467 104041 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.qos_meter_bandwidth        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.467 104041 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_bandwidths = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.467 104041 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_default_hypervisor = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.467 104041 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_hypervisors = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.467 104041 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_inventory_defaults = {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.467 104041 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_inventory_defaults = {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.467 104041 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_with_direction = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.467 104041 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_without_direction = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.467 104041 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_ca_cert_file           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.467 104041 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_cert_file              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.467 104041 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_key_file               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.468 104041 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.tun_peer_patch_port        = patch-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.468 104041 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.tunnel_bridge              = br-tun log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.468 104041 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.vhostuser_socket_dir       = /var/run/openvswitch log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.468 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.468 104041 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.468 104041 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.468 104041 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.468 104041 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.468 104041 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.468 104041 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.468 104041 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.468 104041 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.468 104041 DEBUG neutron.agent.ovn.metadata_agent [-] agent.extensions               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.468 104041 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.468 104041 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.469 104041 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.469 104041 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.469 104041 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.469 104041 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.469 104041 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.469 104041 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.469 104041 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.469 104041 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.469 104041 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.469 104041 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.469 104041 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.469 104041 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.469 104041 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.469 104041 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.469 104041 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.470 104041 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.470 104041 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.470 104041 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.470 104041 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.470 104041 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.470 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.470 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.470 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.470 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.470 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.470 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.470 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.470 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.470 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.470 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.471 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.471 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.471 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.471 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.471 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.471 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.471 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.471 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.471 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.471 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.471 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.471 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.471 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.471 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.472 104041 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.472 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.broadcast_arps_to_all_routers = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.472 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.472 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.472 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_records_ovn_owned      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.472 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.472 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.472 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.fdb_age_threshold          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.472 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.live_migration_activation_strategy = rarp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.472 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.localnet_learn_fdb         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.472 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.mac_binding_age_threshold  = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.472 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.472 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.472 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.473 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.473 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.473 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.473 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.473 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.473 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = ['tcp:127.0.0.1:6641'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.473 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.473 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_router_indirect_snat   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.473 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.473 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.473 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ['ssl:ovsdbserver-sb.openstack.svc:6642'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.473 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.473 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.473 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.474 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.474 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.474 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.474 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.fdb_removal_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.474 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.ignore_lsp_down  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.474 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.mac_binding_removal_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.474 104041 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.base_query_rate_limit = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.474 104041 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.base_window_duration = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.474 104041 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.burst_query_rate_limit = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.474 104041 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.burst_window_duration = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.474 104041 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.ip_versions = [4] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.474 104041 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.rate_limit_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.474 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.474 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.475 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.475 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.475 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.475 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.475 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.475 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.475 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.475 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.475 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.475 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.hostname = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.475 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.475 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.475 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.475 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.475 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.476 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.processname = neutron-ovn-metadata-agent log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.476 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.476 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.476 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.476 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.476 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.476 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.476 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.476 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.476 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.476 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.476 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.476 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.476 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_quorum_queue = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.476 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.477 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.477 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.477 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.477 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.477 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.477 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.477 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.477 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.477 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.477 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.477 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.477 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.file_event_handler = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.477 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.477 104041 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.477 104041 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.484 104041 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.484 104041 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.484 104041 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.484 104041 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.484 104041 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.492 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 89d40815-76f5-4f1d-9077-84d831b7d6c4 (UUID: 89d40815-76f5-4f1d-9077-84d831b7d6c4) and ovn bridge br-int. _load_config /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:419
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.509 104041 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.509 104041 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.509 104041 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Port_Binding.logical_port autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.509 104041 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.509 104041 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.512 104041 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.515 104041 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.519 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '89d40815-76f5-4f1d-9077-84d831b7d6c4'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], external_ids={}, name=89d40815-76f5-4f1d-9077-84d831b7d6c4, nb_cfg_timestamp=1764914338713, nb_cfg=1) old= matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 05:59:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:29.521 104041 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpgdsiruyj/privsep.sock']
Dec 05 05:59:30 compute-0 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Dec 05 05:59:30 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:30.092 104041 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Dec 05 05:59:30 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:30.093 104041 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpgdsiruyj/privsep.sock __init__ /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:377
Dec 05 05:59:30 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:30.003 104153 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 05 05:59:30 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:30.006 104153 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 05 05:59:30 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:30.008 104153 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Dec 05 05:59:30 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:30.008 104153 INFO oslo.privsep.daemon [-] privsep daemon running as pid 104153
Dec 05 05:59:30 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:30.094 104153 DEBUG oslo.privsep.daemon [-] privsep: reply[e6b0e4b8-4811-4f25-95cb-1baeb3e52d18]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 05:59:30 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:30.479 104153 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 05:59:30 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:30.479 104153 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 05:59:30 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:30.479 104153 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 05:59:30 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:30.857 104153 INFO oslo_service.backend [-] Loading backend: eventlet
Dec 05 05:59:30 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:30.862 104153 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Dec 05 05:59:30 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:30.896 104153 DEBUG oslo.privsep.daemon [-] privsep: reply[faa6dd3d-06bf-4914-9a09-c7fcd68d2fde]: (4, []) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 05:59:30 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:30.897 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=89d40815-76f5-4f1d-9077-84d831b7d6c4, column=external_ids, values=({'neutron:ovn-metadata-id': 'c4189367-1936-5974-a10a-37cb40662e7f'},)) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 05:59:30 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:30.901 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=89d40815-76f5-4f1d-9077-84d831b7d6c4, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 05:59:30 compute-0 ovn_metadata_agent[104036]: 2025-12-05 05:59:30.910 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=89d40815-76f5-4f1d-9077-84d831b7d6c4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 05:59:33 compute-0 sshd-session[104158]: Accepted publickey for zuul from 192.168.122.30 port 48320 ssh2: ECDSA SHA256:qs1gnk6DlsWBMwOXH28ouwL9ltr8rmS6SqK96Fn6Lw8
Dec 05 05:59:33 compute-0 systemd-logind[745]: New session 22 of user zuul.
Dec 05 05:59:33 compute-0 systemd[1]: Started Session 22 of User zuul.
Dec 05 05:59:33 compute-0 sshd-session[104158]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 05:59:34 compute-0 python3.9[104311]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 05:59:34 compute-0 sudo[104465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfvwvqtyyquqzgebkfkzuokkkxavezlu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914374.3827796-48-268627247082571/AnsiballZ_command.py'
Dec 05 05:59:34 compute-0 sudo[104465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:34 compute-0 python3.9[104467]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:59:34 compute-0 sudo[104465]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:35 compute-0 sudo[104626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slrfxzzljepptlilmjotuhislblukkiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914375.1246912-70-223077237550605/AnsiballZ_systemd_service.py'
Dec 05 05:59:35 compute-0 sudo[104626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:35 compute-0 python3.9[104628]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 05 05:59:35 compute-0 systemd[1]: Reloading.
Dec 05 05:59:35 compute-0 systemd-rc-local-generator[104648]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 05:59:35 compute-0 systemd-sysv-generator[104652]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 05:59:35 compute-0 sudo[104626]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:36 compute-0 python3.9[104813]: ansible-ansible.builtin.service_facts Invoked
Dec 05 05:59:36 compute-0 network[104830]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 05 05:59:36 compute-0 network[104831]: 'network-scripts' will be removed from distribution in near future.
Dec 05 05:59:36 compute-0 network[104832]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 05 05:59:38 compute-0 sudo[105091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-allavogavueidzxdjnrmytdoqyvtaqgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914378.4984043-108-221920960138451/AnsiballZ_systemd_service.py'
Dec 05 05:59:38 compute-0 sudo[105091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:38 compute-0 python3.9[105093]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 05:59:38 compute-0 sudo[105091]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:39 compute-0 sudo[105244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqopofqurttukbumsdbxkbxpsffqtwjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914379.038394-108-95221261512044/AnsiballZ_systemd_service.py'
Dec 05 05:59:39 compute-0 sudo[105244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:39 compute-0 python3.9[105246]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 05:59:39 compute-0 sudo[105244]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:39 compute-0 sudo[105397]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbyuactviagobazfjejcxibnqafqypwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914379.5893533-108-269041326433045/AnsiballZ_systemd_service.py'
Dec 05 05:59:39 compute-0 sudo[105397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:40 compute-0 python3.9[105399]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 05:59:40 compute-0 sudo[105397]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:40 compute-0 sudo[105550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxviksytewywanpexkzvwlaxvopyeswv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914380.127386-108-245612953293540/AnsiballZ_systemd_service.py'
Dec 05 05:59:40 compute-0 sudo[105550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:40 compute-0 python3.9[105552]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 05:59:40 compute-0 sudo[105550]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:40 compute-0 sudo[105703]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucwmzwputjpizoosvfvlopzzbqoplkss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914380.6562157-108-55348704958576/AnsiballZ_systemd_service.py'
Dec 05 05:59:40 compute-0 sudo[105703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:41 compute-0 python3.9[105705]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 05:59:41 compute-0 sudo[105703]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:41 compute-0 sudo[105856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ximcfuepzfntqrhmekfdpmpnnaeojwcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914381.1839733-108-262029600253436/AnsiballZ_systemd_service.py'
Dec 05 05:59:41 compute-0 sudo[105856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:41 compute-0 python3.9[105858]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 05:59:41 compute-0 sudo[105856]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:41 compute-0 sudo[106009]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqwclodxquvmnwslmtdmfaewccrukzak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914381.7004728-108-143779303174326/AnsiballZ_systemd_service.py'
Dec 05 05:59:41 compute-0 sudo[106009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:42 compute-0 python3.9[106011]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 05:59:42 compute-0 sudo[106009]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:42 compute-0 sudo[106162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmtdbyqxpdszujojoxbyocqjwnxngwtf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914382.3707151-212-232718067472867/AnsiballZ_file.py'
Dec 05 05:59:42 compute-0 sudo[106162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:42 compute-0 python3.9[106164]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:59:42 compute-0 sudo[106162]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:43 compute-0 sudo[106314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjqtpgkeimywtlkvovgqjobnijeyqsvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914382.9008803-212-172465305341400/AnsiballZ_file.py'
Dec 05 05:59:43 compute-0 sudo[106314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:43 compute-0 python3.9[106316]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:59:43 compute-0 sudo[106314]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:43 compute-0 sudo[106466]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbfffbkoeswvhybrircwvekvmvccdwhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914383.3004658-212-271095305903957/AnsiballZ_file.py'
Dec 05 05:59:43 compute-0 sudo[106466]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:43 compute-0 python3.9[106468]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:59:43 compute-0 sudo[106466]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:43 compute-0 sudo[106618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlytefyqdrfllbvuobdrnsnqhesyvryb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914383.7186215-212-126111058838704/AnsiballZ_file.py'
Dec 05 05:59:43 compute-0 sudo[106618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:44 compute-0 python3.9[106620]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:59:44 compute-0 sudo[106618]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:44 compute-0 sudo[106770]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cuevimtkihnavvaobjcifwcybnlwqszg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914384.1240158-212-91918331764810/AnsiballZ_file.py'
Dec 05 05:59:44 compute-0 sudo[106770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:44 compute-0 python3.9[106772]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:59:44 compute-0 sudo[106770]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:44 compute-0 sudo[106922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftykjpepjfjvoxqnvwxowkidhkxmsncj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914384.5304878-212-187589136622065/AnsiballZ_file.py'
Dec 05 05:59:44 compute-0 sudo[106922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:44 compute-0 python3.9[106924]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:59:44 compute-0 sudo[106922]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:45 compute-0 sudo[107074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kchukirirlrrjaczmgazhzswsqusznky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914384.9304543-212-208102158133710/AnsiballZ_file.py'
Dec 05 05:59:45 compute-0 sudo[107074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:45 compute-0 python3.9[107076]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:59:45 compute-0 sudo[107074]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:45 compute-0 sudo[107226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjlysheuspunpaxgwjyjgxbunqenxxfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914385.3707645-312-66815264983681/AnsiballZ_file.py'
Dec 05 05:59:45 compute-0 sudo[107226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:45 compute-0 python3.9[107228]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:59:45 compute-0 sudo[107226]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:45 compute-0 sudo[107378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozblnmcemgsxfgizhhccjdumyjcswygx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914385.7731266-312-255209796473558/AnsiballZ_file.py'
Dec 05 05:59:45 compute-0 sudo[107378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:46 compute-0 python3.9[107380]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:59:46 compute-0 sudo[107378]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:46 compute-0 sudo[107530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wscbvlqtajkmvzyfvylaybkewhtdhanj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914386.1708245-312-61389172469353/AnsiballZ_file.py'
Dec 05 05:59:46 compute-0 sudo[107530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:46 compute-0 python3.9[107532]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:59:46 compute-0 sudo[107530]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:46 compute-0 sudo[107682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpqhqvdkgwnsusryqugfxrzngyjwnuex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914386.5745208-312-166862406474430/AnsiballZ_file.py'
Dec 05 05:59:46 compute-0 sudo[107682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:46 compute-0 python3.9[107684]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:59:46 compute-0 sudo[107682]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:47 compute-0 sudo[107834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnzlodtdjwluyzfxezpnqraafufwxgvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914386.9824398-312-121420070071133/AnsiballZ_file.py'
Dec 05 05:59:47 compute-0 sudo[107834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:47 compute-0 python3.9[107836]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:59:47 compute-0 sudo[107834]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:47 compute-0 sudo[107986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxmonxijcsjtiibtmjbldsrzvbbyrqbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914387.383804-312-130571130013970/AnsiballZ_file.py'
Dec 05 05:59:47 compute-0 sudo[107986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:47 compute-0 python3.9[107988]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:59:47 compute-0 sudo[107986]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:47 compute-0 sudo[108138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbpocvxenidxyamuitkpstfifddactdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914387.787179-312-266213375396724/AnsiballZ_file.py'
Dec 05 05:59:47 compute-0 sudo[108138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:48 compute-0 python3.9[108140]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 05:59:48 compute-0 sudo[108138]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:48 compute-0 sudo[108290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyygjsfkdryspxixgtjkxqsvxqdxmfqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914388.3467739-414-73793270185196/AnsiballZ_command.py'
Dec 05 05:59:48 compute-0 sudo[108290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:48 compute-0 python3.9[108292]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:59:48 compute-0 sudo[108290]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:49 compute-0 python3.9[108444]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 05 05:59:49 compute-0 sudo[108594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxostnmlkokijhszwbuluppkosarbepo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914389.4169297-450-97534640576811/AnsiballZ_systemd_service.py'
Dec 05 05:59:49 compute-0 sudo[108594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:49 compute-0 python3.9[108596]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 05 05:59:49 compute-0 systemd[1]: Reloading.
Dec 05 05:59:49 compute-0 systemd-sysv-generator[108619]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 05:59:49 compute-0 systemd-rc-local-generator[108616]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 05:59:50 compute-0 sudo[108594]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:50 compute-0 podman[108632]: 2025-12-05 05:59:50.092354965 +0000 UTC m=+0.065826511 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Dec 05 05:59:50 compute-0 sudo[108804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uupzgpspwilxgyustvlkkmapdfzdsqxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914390.1254814-466-107477117235424/AnsiballZ_command.py'
Dec 05 05:59:50 compute-0 sudo[108804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:50 compute-0 python3.9[108806]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:59:50 compute-0 sudo[108804]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:50 compute-0 sudo[108957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nptwwyqeuveeqtfcilkphwzqntkaeuar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914390.5506184-466-39225181983359/AnsiballZ_command.py'
Dec 05 05:59:50 compute-0 sudo[108957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:50 compute-0 python3.9[108959]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:59:50 compute-0 sudo[108957]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:51 compute-0 sudo[109110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsnzrpmbjtavkkvcauiohtputkhljuiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914390.969896-466-96384132103727/AnsiballZ_command.py'
Dec 05 05:59:51 compute-0 sudo[109110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:51 compute-0 python3.9[109112]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:59:51 compute-0 sudo[109110]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:51 compute-0 sudo[109263]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iydrvuigqdkepucdakhgzkqkvbwqzuee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914391.3860066-466-202937399993125/AnsiballZ_command.py'
Dec 05 05:59:51 compute-0 sudo[109263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:51 compute-0 python3.9[109265]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:59:51 compute-0 sudo[109263]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:51 compute-0 sudo[109416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymntkjhmsykdslmktsxymvatnaxbeaga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914391.8072724-466-113215880439955/AnsiballZ_command.py'
Dec 05 05:59:51 compute-0 sudo[109416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:52 compute-0 python3.9[109418]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:59:52 compute-0 sudo[109416]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:52 compute-0 sudo[109569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zewznzrmfkuvdxdinguwtraqzjcutkqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914392.242139-466-142361720900190/AnsiballZ_command.py'
Dec 05 05:59:52 compute-0 sudo[109569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:52 compute-0 python3.9[109571]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:59:52 compute-0 sudo[109569]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:52 compute-0 sudo[109722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pssmgcpmgobwqlpufsukdyxofekwhgwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914392.6633937-466-217153150366935/AnsiballZ_command.py'
Dec 05 05:59:52 compute-0 sudo[109722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:52 compute-0 python3.9[109724]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 05:59:53 compute-0 sudo[109722]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:53 compute-0 sudo[109875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aucaoywspwoexzfomksihmileisrvsie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914393.4386654-574-139495176567417/AnsiballZ_getent.py'
Dec 05 05:59:53 compute-0 sudo[109875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:53 compute-0 python3.9[109877]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Dec 05 05:59:53 compute-0 sudo[109875]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:54 compute-0 sudo[110028]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkjxyessifdsqvplxttjxsuwxacuxlls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914394.0387058-590-22043280022621/AnsiballZ_group.py'
Dec 05 05:59:54 compute-0 sudo[110028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:54 compute-0 python3.9[110030]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 05 05:59:54 compute-0 groupadd[110031]: group added to /etc/group: name=libvirt, GID=42473
Dec 05 05:59:54 compute-0 groupadd[110031]: group added to /etc/gshadow: name=libvirt
Dec 05 05:59:54 compute-0 groupadd[110031]: new group: name=libvirt, GID=42473
Dec 05 05:59:54 compute-0 sudo[110028]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:55 compute-0 sudo[110186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqnmvqipcwbkgixoyqkhfpmefdbtglrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914394.7160494-606-173679578183562/AnsiballZ_user.py'
Dec 05 05:59:55 compute-0 sudo[110186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:55 compute-0 python3.9[110188]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 05 05:59:55 compute-0 useradd[110190]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Dec 05 05:59:55 compute-0 sudo[110186]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:55 compute-0 sudo[110346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttwyxilsgkefafegxcadaamaeshsfimq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914395.6011832-628-138332350749912/AnsiballZ_setup.py'
Dec 05 05:59:55 compute-0 sudo[110346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:56 compute-0 python3.9[110348]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 05 05:59:56 compute-0 sudo[110346]: pam_unix(sudo:session): session closed for user root
Dec 05 05:59:56 compute-0 sudo[110430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdebbrccbsavlmlravilokekjyahkwsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914395.6011832-628-138332350749912/AnsiballZ_dnf.py'
Dec 05 05:59:56 compute-0 sudo[110430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 05:59:56 compute-0 python3.9[110432]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 05:59:58 compute-0 podman[110442]: 2025-12-05 05:59:58.449321441 +0000 UTC m=+0.036804281 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 05 06:00:20 compute-0 podman[110584]: 2025-12-05 06:00:20.494711993 +0000 UTC m=+0.080136041 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 05 06:00:29 compute-0 podman[110664]: 2025-12-05 06:00:29.449665365 +0000 UTC m=+0.033416019 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0)
Dec 05 06:00:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:00:29.479 104041 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:00:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:00:29.479 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:00:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:00:29.479 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:00:31 compute-0 kernel: SELinux:  Converting 2758 SID table entries...
Dec 05 06:00:31 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Dec 05 06:00:31 compute-0 kernel: SELinux:  policy capability open_perms=1
Dec 05 06:00:31 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Dec 05 06:00:31 compute-0 kernel: SELinux:  policy capability always_check_network=0
Dec 05 06:00:31 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 05 06:00:31 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 05 06:00:31 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 05 06:00:38 compute-0 kernel: SELinux:  Converting 2758 SID table entries...
Dec 05 06:00:38 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Dec 05 06:00:38 compute-0 kernel: SELinux:  policy capability open_perms=1
Dec 05 06:00:38 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Dec 05 06:00:38 compute-0 kernel: SELinux:  policy capability always_check_network=0
Dec 05 06:00:38 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 05 06:00:38 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 05 06:00:38 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 05 06:00:51 compute-0 dbus-broker-launch[732]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Dec 05 06:00:51 compute-0 podman[114137]: 2025-12-05 06:00:51.475408968 +0000 UTC m=+0.059851672 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 05 06:01:00 compute-0 podman[123025]: 2025-12-05 06:01:00.452994067 +0000 UTC m=+0.033413022 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 05 06:01:01 compute-0 CROND[124168]: (root) CMD (run-parts /etc/cron.hourly)
Dec 05 06:01:01 compute-0 run-parts[124177]: (/etc/cron.hourly) starting 0anacron
Dec 05 06:01:01 compute-0 anacron[124195]: Anacron started on 2025-12-05
Dec 05 06:01:01 compute-0 anacron[124195]: Will run job `cron.daily' in 9 min.
Dec 05 06:01:01 compute-0 anacron[124195]: Will run job `cron.weekly' in 29 min.
Dec 05 06:01:01 compute-0 anacron[124195]: Will run job `cron.monthly' in 49 min.
Dec 05 06:01:01 compute-0 anacron[124195]: Jobs will be executed sequentially
Dec 05 06:01:01 compute-0 run-parts[124199]: (/etc/cron.hourly) finished 0anacron
Dec 05 06:01:01 compute-0 CROND[124164]: (root) CMDEND (run-parts /etc/cron.hourly)
Dec 05 06:01:13 compute-0 kernel: SELinux:  Converting 2759 SID table entries...
Dec 05 06:01:13 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Dec 05 06:01:13 compute-0 kernel: SELinux:  policy capability open_perms=1
Dec 05 06:01:13 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Dec 05 06:01:13 compute-0 kernel: SELinux:  policy capability always_check_network=0
Dec 05 06:01:13 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 05 06:01:13 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 05 06:01:13 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 05 06:01:14 compute-0 groupadd[127569]: group added to /etc/group: name=dnsmasq, GID=992
Dec 05 06:01:14 compute-0 groupadd[127569]: group added to /etc/gshadow: name=dnsmasq
Dec 05 06:01:14 compute-0 groupadd[127569]: new group: name=dnsmasq, GID=992
Dec 05 06:01:14 compute-0 useradd[127576]: new user: name=dnsmasq, UID=992, GID=992, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Dec 05 06:01:14 compute-0 dbus-broker-launch[725]: Noticed file-system modification, trigger reload.
Dec 05 06:01:14 compute-0 dbus-broker-launch[732]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Dec 05 06:01:14 compute-0 dbus-broker-launch[725]: Noticed file-system modification, trigger reload.
Dec 05 06:01:15 compute-0 groupadd[127589]: group added to /etc/group: name=clevis, GID=991
Dec 05 06:01:15 compute-0 groupadd[127589]: group added to /etc/gshadow: name=clevis
Dec 05 06:01:15 compute-0 groupadd[127589]: new group: name=clevis, GID=991
Dec 05 06:01:15 compute-0 useradd[127596]: new user: name=clevis, UID=991, GID=991, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Dec 05 06:01:15 compute-0 usermod[127606]: add 'clevis' to group 'tss'
Dec 05 06:01:15 compute-0 usermod[127606]: add 'clevis' to shadow group 'tss'
Dec 05 06:01:16 compute-0 polkitd[43679]: Reloading rules
Dec 05 06:01:16 compute-0 polkitd[43679]: Collecting garbage unconditionally...
Dec 05 06:01:16 compute-0 polkitd[43679]: Loading rules from directory /etc/polkit-1/rules.d
Dec 05 06:01:16 compute-0 polkitd[43679]: Loading rules from directory /usr/share/polkit-1/rules.d
Dec 05 06:01:16 compute-0 polkitd[43679]: Finished loading, compiling and executing 3 rules
Dec 05 06:01:16 compute-0 polkitd[43679]: Reloading rules
Dec 05 06:01:16 compute-0 polkitd[43679]: Collecting garbage unconditionally...
Dec 05 06:01:16 compute-0 polkitd[43679]: Loading rules from directory /etc/polkit-1/rules.d
Dec 05 06:01:16 compute-0 polkitd[43679]: Loading rules from directory /usr/share/polkit-1/rules.d
Dec 05 06:01:16 compute-0 polkitd[43679]: Finished loading, compiling and executing 3 rules
Dec 05 06:01:17 compute-0 groupadd[127793]: group added to /etc/group: name=ceph, GID=167
Dec 05 06:01:17 compute-0 groupadd[127793]: group added to /etc/gshadow: name=ceph
Dec 05 06:01:17 compute-0 groupadd[127793]: new group: name=ceph, GID=167
Dec 05 06:01:17 compute-0 useradd[127799]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Dec 05 06:01:19 compute-0 systemd[1]: Stopping OpenSSH server daemon...
Dec 05 06:01:19 compute-0 sshd[962]: Received signal 15; terminating.
Dec 05 06:01:19 compute-0 systemd[1]: sshd.service: Deactivated successfully.
Dec 05 06:01:19 compute-0 systemd[1]: Stopped OpenSSH server daemon.
Dec 05 06:01:19 compute-0 systemd[1]: Stopped target sshd-keygen.target.
Dec 05 06:01:19 compute-0 systemd[1]: Stopping sshd-keygen.target...
Dec 05 06:01:19 compute-0 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 05 06:01:19 compute-0 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 05 06:01:19 compute-0 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 05 06:01:19 compute-0 systemd[1]: Reached target sshd-keygen.target.
Dec 05 06:01:19 compute-0 systemd[1]: Starting OpenSSH server daemon...
Dec 05 06:01:19 compute-0 sshd[128318]: Server listening on 0.0.0.0 port 22.
Dec 05 06:01:19 compute-0 sshd[128318]: Server listening on :: port 22.
Dec 05 06:01:19 compute-0 systemd[1]: Started OpenSSH server daemon.
Dec 05 06:01:20 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 05 06:01:20 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 05 06:01:20 compute-0 systemd[1]: Reloading.
Dec 05 06:01:20 compute-0 systemd-sysv-generator[128572]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 06:01:20 compute-0 systemd-rc-local-generator[128569]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 06:01:21 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 05 06:01:22 compute-0 podman[131271]: 2025-12-05 06:01:22.496625629 +0000 UTC m=+0.084997956 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true)
Dec 05 06:01:22 compute-0 sudo[110430]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:23 compute-0 sudo[132975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkqvwvlhiiikunjekqtrayxegrodsgad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914482.8544803-652-7314764840218/AnsiballZ_systemd.py'
Dec 05 06:01:23 compute-0 sudo[132975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:23 compute-0 python3.9[133004]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 05 06:01:23 compute-0 systemd[1]: Reloading.
Dec 05 06:01:23 compute-0 systemd-rc-local-generator[133570]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 06:01:23 compute-0 systemd-sysv-generator[133574]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 06:01:23 compute-0 sudo[132975]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:24 compute-0 sudo[134425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmcrnscznbxamgmpercegpkrookrozdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914483.9621463-652-69187184597434/AnsiballZ_systemd.py'
Dec 05 06:01:24 compute-0 sudo[134425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:24 compute-0 python3.9[134446]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 05 06:01:24 compute-0 systemd[1]: Reloading.
Dec 05 06:01:24 compute-0 systemd-rc-local-generator[134956]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 06:01:24 compute-0 systemd-sysv-generator[134960]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 06:01:24 compute-0 sudo[134425]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:24 compute-0 sudo[135714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyybqkiyrdqdgvaoaniygymvkjaqvtho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914484.7424307-652-258528664377696/AnsiballZ_systemd.py'
Dec 05 06:01:24 compute-0 sudo[135714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:25 compute-0 python3.9[135737]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 05 06:01:25 compute-0 systemd[1]: Reloading.
Dec 05 06:01:25 compute-0 systemd-rc-local-generator[136297]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 06:01:25 compute-0 systemd-sysv-generator[136303]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 06:01:25 compute-0 sudo[135714]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:25 compute-0 sudo[137120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzcvwzlubrpwuzvvqlubpizykxynnmwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914485.5842798-652-18765896452275/AnsiballZ_systemd.py'
Dec 05 06:01:25 compute-0 sudo[137120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:26 compute-0 python3.9[137138]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 05 06:01:26 compute-0 systemd[1]: Reloading.
Dec 05 06:01:26 compute-0 systemd-sysv-generator[137657]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 06:01:26 compute-0 systemd-rc-local-generator[137639]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 06:01:26 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 05 06:01:26 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 05 06:01:26 compute-0 systemd[1]: man-db-cache-update.service: Consumed 6.821s CPU time.
Dec 05 06:01:26 compute-0 systemd[1]: run-r4a7bcf1561b2468eaec6535f53b77c37.service: Deactivated successfully.
Dec 05 06:01:26 compute-0 sudo[137120]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:26 compute-0 sudo[137883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjmgslknwuzihrayeksvgdaaffonrbjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914486.4886894-710-249816842648565/AnsiballZ_systemd.py'
Dec 05 06:01:26 compute-0 sudo[137883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:26 compute-0 python3.9[137885]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 06:01:26 compute-0 systemd[1]: Reloading.
Dec 05 06:01:27 compute-0 systemd-rc-local-generator[137910]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 06:01:27 compute-0 systemd-sysv-generator[137913]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 06:01:27 compute-0 sudo[137883]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:27 compute-0 sudo[138074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccmjpaxxjejcqsmrsbgjlbgymqqjkkbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914487.2557018-710-21007631410375/AnsiballZ_systemd.py'
Dec 05 06:01:27 compute-0 sudo[138074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:27 compute-0 python3.9[138076]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 06:01:27 compute-0 systemd[1]: Reloading.
Dec 05 06:01:27 compute-0 systemd-rc-local-generator[138100]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 06:01:27 compute-0 systemd-sysv-generator[138103]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 06:01:27 compute-0 sudo[138074]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:28 compute-0 sudo[138264]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qiwnifgannihwbgsbadqymjhjvnxefrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914488.0144613-710-57439591048602/AnsiballZ_systemd.py'
Dec 05 06:01:28 compute-0 sudo[138264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:28 compute-0 python3.9[138266]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 06:01:28 compute-0 systemd[1]: Reloading.
Dec 05 06:01:28 compute-0 systemd-sysv-generator[138293]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 06:01:28 compute-0 systemd-rc-local-generator[138290]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 06:01:28 compute-0 sudo[138264]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:28 compute-0 sudo[138454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxcaxebysarhmkvdeqivhopbwotnbcml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914488.753884-710-244087560800822/AnsiballZ_systemd.py'
Dec 05 06:01:28 compute-0 sudo[138454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:29 compute-0 python3.9[138456]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 06:01:29 compute-0 sudo[138454]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:01:29.480 104041 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:01:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:01:29.480 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:01:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:01:29.480 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:01:29 compute-0 sudo[138610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbhzbwfhblohpbmczgxxputnbqfjtbnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914489.337168-710-20715935728971/AnsiballZ_systemd.py'
Dec 05 06:01:29 compute-0 sudo[138610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:29 compute-0 python3.9[138612]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 06:01:29 compute-0 systemd[1]: Reloading.
Dec 05 06:01:29 compute-0 systemd-rc-local-generator[138636]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 06:01:29 compute-0 systemd-sysv-generator[138647]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 06:01:30 compute-0 sudo[138610]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:30 compute-0 sudo[138810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgdohgiubxfolctyzenclvupepsvevwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914490.744833-782-73350677827960/AnsiballZ_systemd.py'
Dec 05 06:01:30 compute-0 sudo[138810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:30 compute-0 podman[138774]: 2025-12-05 06:01:30.98347229 +0000 UTC m=+0.072077993 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, 
io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 05 06:01:31 compute-0 python3.9[138818]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 05 06:01:31 compute-0 systemd[1]: Reloading.
Dec 05 06:01:31 compute-0 systemd-sysv-generator[138849]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 06:01:31 compute-0 systemd-rc-local-generator[138846]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 06:01:31 compute-0 systemd[1]: Listening on libvirt proxy daemon socket.
Dec 05 06:01:31 compute-0 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Dec 05 06:01:31 compute-0 sudo[138810]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:31 compute-0 sudo[139010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czxxlokpmxrqwhvotmqzkbgsfzuswkmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914491.6486099-798-188270052465430/AnsiballZ_systemd.py'
Dec 05 06:01:31 compute-0 sudo[139010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:32 compute-0 python3.9[139012]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 06:01:32 compute-0 sudo[139010]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:32 compute-0 sudo[139165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkufhtggcfuzxtdrtpymakvfhywtrykd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914492.2563164-798-143147369159975/AnsiballZ_systemd.py'
Dec 05 06:01:32 compute-0 sudo[139165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:32 compute-0 python3.9[139167]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 06:01:32 compute-0 sudo[139165]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:33 compute-0 sudo[139320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgnwwijqnspiivljqkkhlurseutyxswv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914492.857671-798-183036320709509/AnsiballZ_systemd.py'
Dec 05 06:01:33 compute-0 sudo[139320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:33 compute-0 python3.9[139322]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 06:01:33 compute-0 sudo[139320]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:33 compute-0 sudo[139475]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmkhbkvxmpaqgolegamujcolmpmjgtjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914493.4331076-798-158344639578340/AnsiballZ_systemd.py'
Dec 05 06:01:33 compute-0 sudo[139475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:33 compute-0 python3.9[139477]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 06:01:33 compute-0 sudo[139475]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:34 compute-0 sudo[139630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpwviponxlmxbbeawokfobdmpcxeizrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914494.011945-798-219115322211349/AnsiballZ_systemd.py'
Dec 05 06:01:34 compute-0 sudo[139630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:34 compute-0 python3.9[139632]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 06:01:34 compute-0 sudo[139630]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:34 compute-0 sudo[139785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xiebiwpyzmcogcadayfgimdihfepguwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914494.7443533-798-280940549610360/AnsiballZ_systemd.py'
Dec 05 06:01:34 compute-0 sudo[139785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:35 compute-0 python3.9[139787]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 06:01:35 compute-0 sudo[139785]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:35 compute-0 sudo[139940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unozqedxazfwcjroyjelbquojymqhtsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914495.2883866-798-254097115843142/AnsiballZ_systemd.py'
Dec 05 06:01:35 compute-0 sudo[139940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:35 compute-0 python3.9[139942]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 06:01:35 compute-0 sudo[139940]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:36 compute-0 sudo[140095]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytqdwjtvztsmoijnlyipveoaresddgvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914495.8282783-798-2027732701380/AnsiballZ_systemd.py'
Dec 05 06:01:36 compute-0 sudo[140095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:36 compute-0 python3.9[140097]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 06:01:36 compute-0 sudo[140095]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:36 compute-0 sudo[140250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plnfskkvnmqkyxnyboanpodpniwxsmqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914496.4014566-798-195813090357613/AnsiballZ_systemd.py'
Dec 05 06:01:36 compute-0 sudo[140250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:36 compute-0 python3.9[140252]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 06:01:36 compute-0 sudo[140250]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:37 compute-0 sudo[140405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuqztmcjivjhvefkiiftdfufangcjblq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914496.9526484-798-80842868546522/AnsiballZ_systemd.py'
Dec 05 06:01:37 compute-0 sudo[140405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:37 compute-0 python3.9[140407]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 06:01:37 compute-0 sudo[140405]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:37 compute-0 sudo[140560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fprseconpxwtncegeloozvktbadgkgca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914497.5040305-798-164547502836869/AnsiballZ_systemd.py'
Dec 05 06:01:37 compute-0 sudo[140560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:37 compute-0 python3.9[140562]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 06:01:37 compute-0 sudo[140560]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:38 compute-0 sudo[140715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdscaqejpccfdryqnltcsaxoegkrdhdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914498.074019-798-123131036383034/AnsiballZ_systemd.py'
Dec 05 06:01:38 compute-0 sudo[140715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:38 compute-0 python3.9[140717]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 06:01:38 compute-0 sudo[140715]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:38 compute-0 sudo[140870]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-honbphtfolmmrorevbwrvmbskgzelnff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914498.6958392-798-90530833188595/AnsiballZ_systemd.py'
Dec 05 06:01:38 compute-0 sudo[140870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:39 compute-0 python3.9[140872]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 06:01:39 compute-0 sudo[140870]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:39 compute-0 sudo[141025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avzzfrnfgfmixrpmwmcmnifyzcssmzpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914499.3177726-798-199271409106927/AnsiballZ_systemd.py'
Dec 05 06:01:39 compute-0 sudo[141025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:39 compute-0 python3.9[141027]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 05 06:01:39 compute-0 sudo[141025]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:40 compute-0 sudo[141180]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmcpdzmzskcpzwvzsqrbjvdqxdnoolua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914500.2976825-1002-275063410538485/AnsiballZ_file.py'
Dec 05 06:01:40 compute-0 sudo[141180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:40 compute-0 python3.9[141182]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 05 06:01:40 compute-0 sudo[141180]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:40 compute-0 sudo[141332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkjumppszastzkpbxjgcapfvgcuiprdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914500.7293231-1002-187258860663614/AnsiballZ_file.py'
Dec 05 06:01:40 compute-0 sudo[141332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:41 compute-0 python3.9[141334]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 05 06:01:41 compute-0 sudo[141332]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:41 compute-0 sudo[141484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utpvddpbohraomiabafhlkpusjtwdivu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914501.1580966-1002-8873204230301/AnsiballZ_file.py'
Dec 05 06:01:41 compute-0 sudo[141484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:41 compute-0 python3.9[141486]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 06:01:41 compute-0 sudo[141484]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:41 compute-0 sudo[141636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzxbnpkxgfekuqyoplpovrvtsofegvfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914501.5820959-1002-155126019748058/AnsiballZ_file.py'
Dec 05 06:01:41 compute-0 sudo[141636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:41 compute-0 python3.9[141638]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 06:01:41 compute-0 sudo[141636]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:42 compute-0 sudo[141788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obwmnaqomuqdwkmjtbyoamrdhjfksotk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914502.0232408-1002-96458064355500/AnsiballZ_file.py'
Dec 05 06:01:42 compute-0 sudo[141788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:42 compute-0 python3.9[141790]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 06:01:42 compute-0 sudo[141788]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:42 compute-0 sudo[141940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aubysargxexhkyzwwsonerbuomksyimx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914502.4461172-1002-46818264397487/AnsiballZ_file.py'
Dec 05 06:01:42 compute-0 sudo[141940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:42 compute-0 python3.9[141942]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 05 06:01:42 compute-0 sudo[141940]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:43 compute-0 sudo[142092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtcbyhstxszwravdjxttsvjqoygbkwuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914503.2668512-1088-230508288298719/AnsiballZ_stat.py'
Dec 05 06:01:43 compute-0 sudo[142092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:43 compute-0 python3.9[142094]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:01:43 compute-0 sudo[142092]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:44 compute-0 sudo[142217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dytbnfujlxtfigoqsttwizcjvvtiboek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914503.2668512-1088-230508288298719/AnsiballZ_copy.py'
Dec 05 06:01:44 compute-0 sudo[142217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:44 compute-0 python3.9[142219]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764914503.2668512-1088-230508288298719/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:01:44 compute-0 sudo[142217]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:44 compute-0 sudo[142369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzccajjwcjfezvzdqydbhjzfziqjbrmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914504.3658397-1088-102800276793391/AnsiballZ_stat.py'
Dec 05 06:01:44 compute-0 sudo[142369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:44 compute-0 python3.9[142371]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:01:44 compute-0 sudo[142369]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:44 compute-0 sudo[142494]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okozrvmotzdpcmdkggekzxryhbmfwgml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914504.3658397-1088-102800276793391/AnsiballZ_copy.py'
Dec 05 06:01:44 compute-0 sudo[142494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:45 compute-0 python3.9[142496]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764914504.3658397-1088-102800276793391/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:01:45 compute-0 sudo[142494]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:45 compute-0 sudo[142646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qojqspzwhonihektzknuiggzuodwvhum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914505.19435-1088-255637843043479/AnsiballZ_stat.py'
Dec 05 06:01:45 compute-0 sudo[142646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:45 compute-0 python3.9[142648]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:01:45 compute-0 sudo[142646]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:45 compute-0 sudo[142771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhuuwoakxnsuvafkieqbytlrzxjsedrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914505.19435-1088-255637843043479/AnsiballZ_copy.py'
Dec 05 06:01:45 compute-0 sudo[142771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:45 compute-0 python3.9[142773]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764914505.19435-1088-255637843043479/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:01:45 compute-0 sudo[142771]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:46 compute-0 sudo[142923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsaroznvpljfyidphjsvopxkpusliiuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914506.0221813-1088-126307556645218/AnsiballZ_stat.py'
Dec 05 06:01:46 compute-0 sudo[142923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:46 compute-0 python3.9[142925]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:01:46 compute-0 sudo[142923]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:46 compute-0 sudo[143048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-holhhtibklyoidxogsepvtddqjykmttj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914506.0221813-1088-126307556645218/AnsiballZ_copy.py'
Dec 05 06:01:46 compute-0 sudo[143048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:46 compute-0 python3.9[143050]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764914506.0221813-1088-126307556645218/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:01:46 compute-0 sudo[143048]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:47 compute-0 sudo[143200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adxopyblydrkyzjdvmwjrdpprfyyehgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914506.8659098-1088-253349631515242/AnsiballZ_stat.py'
Dec 05 06:01:47 compute-0 sudo[143200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:47 compute-0 python3.9[143202]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:01:47 compute-0 sudo[143200]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:47 compute-0 sudo[143325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ledsypljmytftaiuwougtcydciougmsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914506.8659098-1088-253349631515242/AnsiballZ_copy.py'
Dec 05 06:01:47 compute-0 sudo[143325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:47 compute-0 python3.9[143327]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764914506.8659098-1088-253349631515242/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:01:47 compute-0 sudo[143325]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:48 compute-0 sudo[143477]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybvpvtzckdappdimihvsldptnmzriftw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914507.7088122-1088-67788713219540/AnsiballZ_stat.py'
Dec 05 06:01:48 compute-0 sudo[143477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:48 compute-0 python3.9[143479]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:01:48 compute-0 sudo[143477]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:48 compute-0 sudo[143602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnqkyxghucdpjytseogljfqtltwrprix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914507.7088122-1088-67788713219540/AnsiballZ_copy.py'
Dec 05 06:01:48 compute-0 sudo[143602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:48 compute-0 python3.9[143604]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764914507.7088122-1088-67788713219540/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:01:48 compute-0 sudo[143602]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:48 compute-0 sudo[143754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knuyausfswgyamwlaikhwlzjfmmezthf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914508.6902406-1088-141423354817570/AnsiballZ_stat.py'
Dec 05 06:01:48 compute-0 sudo[143754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:49 compute-0 python3.9[143756]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:01:49 compute-0 sudo[143754]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:49 compute-0 sudo[143877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgrxizkkydymtvuwyrrneocfjfgtogxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914508.6902406-1088-141423354817570/AnsiballZ_copy.py'
Dec 05 06:01:49 compute-0 sudo[143877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:49 compute-0 python3.9[143879]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764914508.6902406-1088-141423354817570/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:01:49 compute-0 sudo[143877]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:49 compute-0 sudo[144029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlqsznsjscmprbengftgbpxmnuczzmmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914509.5047708-1088-112679887118398/AnsiballZ_stat.py'
Dec 05 06:01:49 compute-0 sudo[144029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:49 compute-0 python3.9[144031]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:01:49 compute-0 sudo[144029]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:50 compute-0 sudo[144154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tiqqmwcjuahzlixcgfvyhmuvbicwfqmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914509.5047708-1088-112679887118398/AnsiballZ_copy.py'
Dec 05 06:01:50 compute-0 sudo[144154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:50 compute-0 python3.9[144156]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764914509.5047708-1088-112679887118398/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:01:50 compute-0 sudo[144154]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:50 compute-0 sudo[144306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyeboehovhplcaxuisuaijtjnukbfbui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914510.4167302-1314-212820287320587/AnsiballZ_command.py'
Dec 05 06:01:50 compute-0 sudo[144306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:50 compute-0 python3.9[144308]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Dec 05 06:01:50 compute-0 sudo[144306]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:51 compute-0 sudo[144459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgkcwklzbikevjuscscpszzbqqaitepu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914510.947862-1332-192059723770001/AnsiballZ_file.py'
Dec 05 06:01:51 compute-0 sudo[144459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:51 compute-0 python3.9[144461]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:01:51 compute-0 sudo[144459]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:51 compute-0 sudo[144611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwlsmargbqekfqtfmjeclmmwpzheseak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914511.3788805-1332-216974438424930/AnsiballZ_file.py'
Dec 05 06:01:51 compute-0 sudo[144611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:51 compute-0 python3.9[144613]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:01:51 compute-0 sudo[144611]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:52 compute-0 sudo[144763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jiqegijwtnfcdzejhhghmdlgxwqfdhfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914511.8286884-1332-104704446400471/AnsiballZ_file.py'
Dec 05 06:01:52 compute-0 sudo[144763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:52 compute-0 python3.9[144765]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:01:52 compute-0 sudo[144763]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:52 compute-0 sudo[144915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpibswtcboqrwtzhnhwuigawaimpgcic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914512.258762-1332-4906783258972/AnsiballZ_file.py'
Dec 05 06:01:52 compute-0 sudo[144915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:52 compute-0 python3.9[144917]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:01:52 compute-0 sudo[144915]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:52 compute-0 sudo[145076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjqzkdowtbnojgwoogmlfxlerxqspugy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914512.8013396-1332-240847209714739/AnsiballZ_file.py'
Dec 05 06:01:52 compute-0 sudo[145076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:53 compute-0 podman[145041]: 2025-12-05 06:01:53.02071932 +0000 UTC m=+0.058360627 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 05 06:01:53 compute-0 python3.9[145082]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:01:53 compute-0 sudo[145076]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:53 compute-0 sudo[145242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdtkmkoadttxowqkeiqdeljbrezrpxch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914513.2618961-1332-19607934607367/AnsiballZ_file.py'
Dec 05 06:01:53 compute-0 sudo[145242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:53 compute-0 python3.9[145244]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:01:53 compute-0 sudo[145242]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:53 compute-0 sudo[145394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-covoutyysthpjinekkpnsbhuxrppntuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914513.698878-1332-60718631275855/AnsiballZ_file.py'
Dec 05 06:01:53 compute-0 sudo[145394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:54 compute-0 python3.9[145396]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:01:54 compute-0 sudo[145394]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:54 compute-0 sudo[145546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvtaaklvcsgekzajtwzoazmjvdmiwkwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914514.1257167-1332-100467781041316/AnsiballZ_file.py'
Dec 05 06:01:54 compute-0 sudo[145546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:54 compute-0 python3.9[145548]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:01:54 compute-0 sudo[145546]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:54 compute-0 sudo[145698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpuyvostklllnwreurrstfevspwrhfab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914514.5498943-1332-86502363251280/AnsiballZ_file.py'
Dec 05 06:01:54 compute-0 sudo[145698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:54 compute-0 python3.9[145700]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:01:54 compute-0 sudo[145698]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:55 compute-0 sudo[145850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghfmwcttdbvvfncgmtsyesitkopctxhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914514.9698107-1332-93548656818409/AnsiballZ_file.py'
Dec 05 06:01:55 compute-0 sudo[145850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:55 compute-0 python3.9[145852]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:01:55 compute-0 sudo[145850]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:55 compute-0 sudo[146002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yypvduasrctwzwezknrsnzsbkyephekn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914515.3920307-1332-52230941470290/AnsiballZ_file.py'
Dec 05 06:01:55 compute-0 sudo[146002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:55 compute-0 python3.9[146004]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:01:55 compute-0 sudo[146002]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:55 compute-0 sudo[146154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpczwfxyemetuqlvqntkgvjyxlpoqyru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914515.8142788-1332-55313446725952/AnsiballZ_file.py'
Dec 05 06:01:55 compute-0 sudo[146154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:56 compute-0 python3.9[146156]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:01:56 compute-0 sudo[146154]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:56 compute-0 sudo[146306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chnnadeqcjadpskyonqtdpsyzejaamyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914516.254812-1332-185336407485819/AnsiballZ_file.py'
Dec 05 06:01:56 compute-0 sudo[146306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:56 compute-0 python3.9[146308]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:01:56 compute-0 sudo[146306]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:56 compute-0 sudo[146458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bccuiobjfuenghxczastafldzgydivxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914516.683167-1332-65628858307650/AnsiballZ_file.py'
Dec 05 06:01:56 compute-0 sudo[146458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:57 compute-0 python3.9[146460]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:01:57 compute-0 sudo[146458]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:57 compute-0 sudo[146610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkmektielbkqgpxijlvnbcehrrwhuhpe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914517.4507792-1530-62898784973260/AnsiballZ_stat.py'
Dec 05 06:01:57 compute-0 sudo[146610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:57 compute-0 python3.9[146612]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:01:57 compute-0 sudo[146610]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:58 compute-0 sudo[146733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jeqxnyherpzibegvoigsgaydgngyctgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914517.4507792-1530-62898784973260/AnsiballZ_copy.py'
Dec 05 06:01:58 compute-0 sudo[146733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:58 compute-0 python3.9[146735]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914517.4507792-1530-62898784973260/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:01:58 compute-0 sudo[146733]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:58 compute-0 sudo[146885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqgdyaigixgooficsycbwxeqrrrxcfmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914518.2875128-1530-211851476214489/AnsiballZ_stat.py'
Dec 05 06:01:58 compute-0 sudo[146885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:58 compute-0 python3.9[146887]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:01:58 compute-0 sudo[146885]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:58 compute-0 sudo[147008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcunfzrxbiiilierktubzdifehhepver ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914518.2875128-1530-211851476214489/AnsiballZ_copy.py'
Dec 05 06:01:58 compute-0 sudo[147008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:58 compute-0 python3.9[147010]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914518.2875128-1530-211851476214489/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:01:59 compute-0 sudo[147008]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:59 compute-0 sudo[147160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-saeyuuayruxpshkxzkorswzlcidwjrfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914519.0954747-1530-203444056349755/AnsiballZ_stat.py'
Dec 05 06:01:59 compute-0 sudo[147160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:59 compute-0 python3.9[147162]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:01:59 compute-0 sudo[147160]: pam_unix(sudo:session): session closed for user root
Dec 05 06:01:59 compute-0 sudo[147283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkasiamtowxjkkiasdtttxicyihfhhso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914519.0954747-1530-203444056349755/AnsiballZ_copy.py'
Dec 05 06:01:59 compute-0 sudo[147283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:01:59 compute-0 python3.9[147285]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914519.0954747-1530-203444056349755/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:01:59 compute-0 sudo[147283]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:00 compute-0 sudo[147435]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfqrilzysvlunkmoumdmeffovuprqogd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914519.9587967-1530-104574290286730/AnsiballZ_stat.py'
Dec 05 06:02:00 compute-0 sudo[147435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:00 compute-0 python3.9[147437]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:02:00 compute-0 sudo[147435]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:00 compute-0 sudo[147558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylsnwlchrawdsnnwndfvmknxivwxrate ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914519.9587967-1530-104574290286730/AnsiballZ_copy.py'
Dec 05 06:02:00 compute-0 sudo[147558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:00 compute-0 python3.9[147560]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914519.9587967-1530-104574290286730/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:02:00 compute-0 sudo[147558]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:01 compute-0 sudo[147719]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjbjireimfqgwifsdrevatuwuektgzxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914520.8152971-1530-73188406126188/AnsiballZ_stat.py'
Dec 05 06:02:01 compute-0 sudo[147719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:01 compute-0 podman[147684]: 2025-12-05 06:02:01.151499905 +0000 UTC m=+0.039471227 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 05 06:02:01 compute-0 python3.9[147728]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:02:01 compute-0 sudo[147719]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:01 compute-0 sudo[147849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdzunmdujmvbkklfqlofbpqgrvojctko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914520.8152971-1530-73188406126188/AnsiballZ_copy.py'
Dec 05 06:02:01 compute-0 sudo[147849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:01 compute-0 python3.9[147851]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914520.8152971-1530-73188406126188/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:02:01 compute-0 sudo[147849]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:01 compute-0 sudo[148001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wixqkvggsdoyjhwjulxwplxchoxxvzlr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914521.7768273-1530-49575113527674/AnsiballZ_stat.py'
Dec 05 06:02:01 compute-0 sudo[148001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:02 compute-0 python3.9[148003]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:02:02 compute-0 sudo[148001]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:02 compute-0 sudo[148124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lyajnexbtammewywmidtkleoochkuczj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914521.7768273-1530-49575113527674/AnsiballZ_copy.py'
Dec 05 06:02:02 compute-0 sudo[148124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:02 compute-0 python3.9[148126]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914521.7768273-1530-49575113527674/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:02:02 compute-0 sudo[148124]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:02 compute-0 sudo[148276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttpnpsggjiqzpyrsxeuezurbdrahtzod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914522.5866616-1530-215906105355182/AnsiballZ_stat.py'
Dec 05 06:02:02 compute-0 sudo[148276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:02 compute-0 python3.9[148278]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:02:02 compute-0 sudo[148276]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:03 compute-0 sudo[148399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qywnqvqyeinlarghqeddwbsmulpzglqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914522.5866616-1530-215906105355182/AnsiballZ_copy.py'
Dec 05 06:02:03 compute-0 sudo[148399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:03 compute-0 python3.9[148401]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914522.5866616-1530-215906105355182/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:02:03 compute-0 sudo[148399]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:03 compute-0 sudo[148551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwrphezncrajqfebmlnqucyhxjafiaqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914523.4136078-1530-174852409517042/AnsiballZ_stat.py'
Dec 05 06:02:03 compute-0 sudo[148551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:03 compute-0 python3.9[148553]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:02:03 compute-0 sudo[148551]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:03 compute-0 sudo[148674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkhdjhhluprzsxlsuqeianoqxidoalkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914523.4136078-1530-174852409517042/AnsiballZ_copy.py'
Dec 05 06:02:03 compute-0 sudo[148674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:04 compute-0 python3.9[148676]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914523.4136078-1530-174852409517042/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:02:04 compute-0 sudo[148674]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:04 compute-0 sudo[148826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-texuxuhrcdpxbfyjquicuzxsgpwipczt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914524.2131133-1530-232208824285787/AnsiballZ_stat.py'
Dec 05 06:02:04 compute-0 sudo[148826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:04 compute-0 python3.9[148828]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:02:04 compute-0 sudo[148826]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:04 compute-0 sudo[148949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvxastdczlahlcjspigyysgcdxrrnova ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914524.2131133-1530-232208824285787/AnsiballZ_copy.py'
Dec 05 06:02:04 compute-0 sudo[148949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:04 compute-0 python3.9[148951]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914524.2131133-1530-232208824285787/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:02:04 compute-0 sudo[148949]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:05 compute-0 sudo[149101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnzmrzosigqyeingbucobbeybbaqhnny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914525.020755-1530-127697216656754/AnsiballZ_stat.py'
Dec 05 06:02:05 compute-0 sudo[149101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:05 compute-0 python3.9[149103]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:02:05 compute-0 sudo[149101]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:05 compute-0 sudo[149224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rutpzpcmqevkqjruiahhhbkkjdicxjfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914525.020755-1530-127697216656754/AnsiballZ_copy.py'
Dec 05 06:02:05 compute-0 sudo[149224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:05 compute-0 python3.9[149226]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914525.020755-1530-127697216656754/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:02:05 compute-0 sudo[149224]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:06 compute-0 sudo[149376]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knodaiogqfzihhfprllywlibuqbjxuiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914525.8456342-1530-107138454869601/AnsiballZ_stat.py'
Dec 05 06:02:06 compute-0 sudo[149376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:06 compute-0 python3.9[149378]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:02:06 compute-0 sudo[149376]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:06 compute-0 sudo[149499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytmlfsozdxixvsxhtpxkgrrsaylcpjop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914525.8456342-1530-107138454869601/AnsiballZ_copy.py'
Dec 05 06:02:06 compute-0 sudo[149499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:06 compute-0 python3.9[149501]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914525.8456342-1530-107138454869601/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:02:06 compute-0 sudo[149499]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:06 compute-0 sudo[149651]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbtzcgholcbcthykwxhpilzmruuwalef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914526.6621053-1530-154117533583163/AnsiballZ_stat.py'
Dec 05 06:02:06 compute-0 sudo[149651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:06 compute-0 python3.9[149653]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:02:07 compute-0 sudo[149651]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:07 compute-0 sudo[149774]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnvtjdbekbiyncciqiltvdjlkufyeyhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914526.6621053-1530-154117533583163/AnsiballZ_copy.py'
Dec 05 06:02:07 compute-0 sudo[149774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:07 compute-0 python3.9[149776]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914526.6621053-1530-154117533583163/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:02:07 compute-0 sudo[149774]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:07 compute-0 sudo[149926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdtucvmcfomabpieqhzjtbfybmbdnuem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914527.4955797-1530-92163468739001/AnsiballZ_stat.py'
Dec 05 06:02:07 compute-0 sudo[149926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:07 compute-0 python3.9[149928]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:02:07 compute-0 sudo[149926]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:08 compute-0 sudo[150049]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxnhuljehcigrojshpxywntpqjizixaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914527.4955797-1530-92163468739001/AnsiballZ_copy.py'
Dec 05 06:02:08 compute-0 sudo[150049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:08 compute-0 python3.9[150051]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914527.4955797-1530-92163468739001/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:02:08 compute-0 sudo[150049]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:08 compute-0 sudo[150201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbfaxnzgyfnwpkhtdhdlletcpzoarttg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914528.3023384-1530-71853859993927/AnsiballZ_stat.py'
Dec 05 06:02:08 compute-0 sudo[150201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:08 compute-0 python3.9[150203]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:02:08 compute-0 sudo[150201]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:08 compute-0 sudo[150324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpemdrtypirpdvsrwevbsfjnisdgkolk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914528.3023384-1530-71853859993927/AnsiballZ_copy.py'
Dec 05 06:02:08 compute-0 sudo[150324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:09 compute-0 python3.9[150326]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914528.3023384-1530-71853859993927/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:02:09 compute-0 sudo[150324]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:09 compute-0 python3.9[150476]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 06:02:09 compute-0 sudo[150629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsaxcdkicvesgqhzospbwusjerhwlleu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914529.6441405-1942-61360403049140/AnsiballZ_seboolean.py'
Dec 05 06:02:09 compute-0 sudo[150629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:10 compute-0 python3.9[150631]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Dec 05 06:02:10 compute-0 sudo[150629]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:11 compute-0 sudo[150785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-muizlmgfmespqgssczhznekabxyltvim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914531.0079503-1958-257686018228825/AnsiballZ_copy.py'
Dec 05 06:02:11 compute-0 dbus-broker-launch[732]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Dec 05 06:02:11 compute-0 sudo[150785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:11 compute-0 python3.9[150787]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:02:11 compute-0 sudo[150785]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:11 compute-0 sudo[150937]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-strdirzpsjmwzdcqonwbryrdkiyehduk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914531.438839-1958-62992652615828/AnsiballZ_copy.py'
Dec 05 06:02:11 compute-0 sudo[150937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:11 compute-0 python3.9[150939]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:02:11 compute-0 sudo[150937]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:12 compute-0 sudo[151089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ziwthigdevhbyslggrvoooavoirephcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914531.8587005-1958-24170046282072/AnsiballZ_copy.py'
Dec 05 06:02:12 compute-0 sudo[151089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:12 compute-0 python3.9[151091]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:02:12 compute-0 sudo[151089]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:12 compute-0 sudo[151241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfuwduxroklkiqggycirdxdufpodbudg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914532.2814167-1958-128940068366185/AnsiballZ_copy.py'
Dec 05 06:02:12 compute-0 sudo[151241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:12 compute-0 python3.9[151243]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:02:12 compute-0 sudo[151241]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:12 compute-0 sudo[151393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqynbzmvjpbwmjdtijamzbnipvvnmqvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914532.7157757-1958-265233436978675/AnsiballZ_copy.py'
Dec 05 06:02:12 compute-0 sudo[151393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:13 compute-0 python3.9[151395]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:02:13 compute-0 sudo[151393]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:13 compute-0 sudo[151545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlzaawudhrccombsyuhbncpjdcjgwufk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914533.208602-2030-149947219685949/AnsiballZ_copy.py'
Dec 05 06:02:13 compute-0 sudo[151545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:13 compute-0 python3.9[151547]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:02:13 compute-0 sudo[151545]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:13 compute-0 sudo[151697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bacrwgyrdccfrtmnndvlioikipoglxwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914533.6462717-2030-14141364871837/AnsiballZ_copy.py'
Dec 05 06:02:13 compute-0 sudo[151697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:13 compute-0 python3.9[151699]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:02:13 compute-0 sudo[151697]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:14 compute-0 sudo[151849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxyltpsjddqffeiwmbfkivtyawpefebz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914534.058431-2030-156552248369872/AnsiballZ_copy.py'
Dec 05 06:02:14 compute-0 sudo[151849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:14 compute-0 python3.9[151851]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:02:14 compute-0 sudo[151849]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:14 compute-0 sudo[152001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sudkpjxxojcblkadxwikeqmkjbokgnzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914534.4808755-2030-166120587264446/AnsiballZ_copy.py'
Dec 05 06:02:14 compute-0 sudo[152001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:14 compute-0 python3.9[152003]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:02:14 compute-0 sudo[152001]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:15 compute-0 sudo[152153]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpymvepvnjzxxwpjudhnxddgmubjvhpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914534.8966594-2030-231358806950117/AnsiballZ_copy.py'
Dec 05 06:02:15 compute-0 sudo[152153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:15 compute-0 python3.9[152155]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:02:15 compute-0 sudo[152153]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:15 compute-0 sudo[152305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhauxnpgykwyprpzeqmrcuzsitdozgjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914535.3778894-2102-73154360327632/AnsiballZ_systemd.py'
Dec 05 06:02:15 compute-0 sudo[152305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:15 compute-0 python3.9[152307]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 06:02:15 compute-0 systemd[1]: Reloading.
Dec 05 06:02:15 compute-0 systemd-rc-local-generator[152327]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 06:02:15 compute-0 systemd-sysv-generator[152331]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 06:02:16 compute-0 systemd[1]: Starting libvirt logging daemon socket...
Dec 05 06:02:16 compute-0 systemd[1]: Listening on libvirt logging daemon socket.
Dec 05 06:02:16 compute-0 systemd[1]: Starting libvirt logging daemon admin socket...
Dec 05 06:02:16 compute-0 systemd[1]: Listening on libvirt logging daemon admin socket.
Dec 05 06:02:16 compute-0 systemd[1]: Starting libvirt logging daemon...
Dec 05 06:02:16 compute-0 systemd[1]: Started libvirt logging daemon.
Dec 05 06:02:16 compute-0 sudo[152305]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:16 compute-0 sudo[152498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsukckecaquuphjelchbtduxezmuqhsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914536.1905808-2102-121599528268211/AnsiballZ_systemd.py'
Dec 05 06:02:16 compute-0 sudo[152498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:16 compute-0 python3.9[152500]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 06:02:16 compute-0 systemd[1]: Reloading.
Dec 05 06:02:16 compute-0 systemd-rc-local-generator[152522]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 06:02:16 compute-0 systemd-sysv-generator[152527]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 06:02:16 compute-0 systemd[1]: Starting libvirt nodedev daemon socket...
Dec 05 06:02:16 compute-0 systemd[1]: Listening on libvirt nodedev daemon socket.
Dec 05 06:02:16 compute-0 systemd[1]: Starting libvirt nodedev daemon admin socket...
Dec 05 06:02:16 compute-0 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Dec 05 06:02:16 compute-0 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Dec 05 06:02:16 compute-0 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Dec 05 06:02:16 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Dec 05 06:02:16 compute-0 systemd[1]: Started libvirt nodedev daemon.
Dec 05 06:02:16 compute-0 sudo[152498]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:17 compute-0 sudo[152715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ageahbqolwoydzgwmrxcngjxbadwzely ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914536.9680443-2102-199015618047698/AnsiballZ_systemd.py'
Dec 05 06:02:17 compute-0 sudo[152715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:17 compute-0 python3.9[152717]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 06:02:17 compute-0 systemd[1]: Reloading.
Dec 05 06:02:17 compute-0 systemd-sysv-generator[152741]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 06:02:17 compute-0 systemd-rc-local-generator[152738]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 06:02:17 compute-0 systemd[1]: Starting libvirt proxy daemon admin socket...
Dec 05 06:02:17 compute-0 systemd[1]: Starting libvirt proxy daemon read-only socket...
Dec 05 06:02:17 compute-0 systemd[1]: Listening on libvirt proxy daemon admin socket.
Dec 05 06:02:17 compute-0 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Dec 05 06:02:17 compute-0 systemd[1]: Starting libvirt proxy daemon...
Dec 05 06:02:17 compute-0 systemd[1]: Started libvirt proxy daemon.
Dec 05 06:02:17 compute-0 sudo[152715]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:17 compute-0 sudo[152925]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-puslloehfmazccvqprpfomqavalomreq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914537.7520316-2102-275697001682916/AnsiballZ_systemd.py'
Dec 05 06:02:17 compute-0 sudo[152925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:18 compute-0 python3.9[152927]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 06:02:18 compute-0 systemd[1]: Reloading.
Dec 05 06:02:18 compute-0 systemd-sysv-generator[152950]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 06:02:18 compute-0 systemd-rc-local-generator[152947]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 06:02:18 compute-0 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Dec 05 06:02:18 compute-0 systemd[1]: Listening on libvirt locking daemon socket.
Dec 05 06:02:18 compute-0 systemd[1]: Starting libvirt QEMU daemon socket...
Dec 05 06:02:18 compute-0 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Dec 05 06:02:18 compute-0 systemd[1]: Starting Virtual Machine and Container Registration Service...
Dec 05 06:02:18 compute-0 systemd[1]: Listening on libvirt QEMU daemon socket.
Dec 05 06:02:18 compute-0 systemd[1]: Starting libvirt QEMU daemon admin socket...
Dec 05 06:02:18 compute-0 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Dec 05 06:02:18 compute-0 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Dec 05 06:02:18 compute-0 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Dec 05 06:02:18 compute-0 systemd[1]: Started Virtual Machine and Container Registration Service.
Dec 05 06:02:18 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Dec 05 06:02:18 compute-0 systemd[1]: Started libvirt QEMU daemon.
Dec 05 06:02:18 compute-0 sudo[152925]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:18 compute-0 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Dec 05 06:02:18 compute-0 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Dec 05 06:02:18 compute-0 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Dec 05 06:02:18 compute-0 sudo[153147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuhsjjywktyaosgvybletqokwpshacwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914538.5688384-2102-62427793959675/AnsiballZ_systemd.py'
Dec 05 06:02:18 compute-0 sudo[153147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:19 compute-0 python3.9[153149]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 06:02:19 compute-0 systemd[1]: Reloading.
Dec 05 06:02:19 compute-0 systemd-sysv-generator[153175]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 06:02:19 compute-0 systemd-rc-local-generator[153172]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 06:02:19 compute-0 systemd[1]: Starting libvirt secret daemon socket...
Dec 05 06:02:19 compute-0 systemd[1]: Listening on libvirt secret daemon socket.
Dec 05 06:02:19 compute-0 systemd[1]: Starting libvirt secret daemon admin socket...
Dec 05 06:02:19 compute-0 systemd[1]: Starting libvirt secret daemon read-only socket...
Dec 05 06:02:19 compute-0 systemd[1]: Listening on libvirt secret daemon admin socket.
Dec 05 06:02:19 compute-0 systemd[1]: Listening on libvirt secret daemon read-only socket.
Dec 05 06:02:19 compute-0 systemd[1]: Starting libvirt secret daemon...
Dec 05 06:02:19 compute-0 systemd[1]: Started libvirt secret daemon.
Dec 05 06:02:19 compute-0 sudo[153147]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:19 compute-0 setroubleshoot[152963]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l be9ba18b-1a79-4c5f-a09c-3e839e1ed4eb
Dec 05 06:02:19 compute-0 setroubleshoot[152963]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Dec 05 06:02:19 compute-0 setroubleshoot[152963]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l be9ba18b-1a79-4c5f-a09c-3e839e1ed4eb
Dec 05 06:02:19 compute-0 setroubleshoot[152963]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Dec 05 06:02:19 compute-0 sudo[153362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebrccsbplxeqekqfpeumwzxyqjmsgcak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914539.6111515-2176-48074326690254/AnsiballZ_file.py'
Dec 05 06:02:19 compute-0 sudo[153362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:19 compute-0 python3.9[153364]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:02:19 compute-0 sudo[153362]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:20 compute-0 sudo[153514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imvkeaufbsdybpgehlwvoxoztkhhdfuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914540.102995-2192-93221736721531/AnsiballZ_find.py'
Dec 05 06:02:20 compute-0 sudo[153514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:20 compute-0 python3.9[153516]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 05 06:02:20 compute-0 sudo[153514]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:20 compute-0 sudo[153666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvxazztompyabnedsgytqzeeqwmxfybq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914540.759493-2220-158363039667669/AnsiballZ_stat.py'
Dec 05 06:02:20 compute-0 sudo[153666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:21 compute-0 python3.9[153668]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:02:21 compute-0 sudo[153666]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:21 compute-0 sudo[153789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syxizvtfmsvapggmdwqnsylggtqrblcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914540.759493-2220-158363039667669/AnsiballZ_copy.py'
Dec 05 06:02:21 compute-0 sudo[153789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:21 compute-0 python3.9[153791]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764914540.759493-2220-158363039667669/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:02:21 compute-0 sudo[153789]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:21 compute-0 sudo[153941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xsqshyihofeqivvqueebsprsrujswhzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914541.7706509-2252-96563240348507/AnsiballZ_file.py'
Dec 05 06:02:21 compute-0 sudo[153941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:22 compute-0 python3.9[153943]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:02:22 compute-0 sudo[153941]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:22 compute-0 sudo[154094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvruuftkllyagtogqjgmcnchpyxlaols ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914542.2729077-2268-159680811452133/AnsiballZ_stat.py'
Dec 05 06:02:22 compute-0 sudo[154094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:22 compute-0 python3.9[154096]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:02:22 compute-0 sudo[154094]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:22 compute-0 sudo[154172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjffhgshmdgrtsqlritiuyiobwazugxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914542.2729077-2268-159680811452133/AnsiballZ_file.py'
Dec 05 06:02:22 compute-0 sudo[154172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:22 compute-0 python3.9[154174]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:02:22 compute-0 sudo[154172]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:23 compute-0 sudo[154332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tprdcfwxgwqqpbpjdvlbxuhnjhspxvew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914543.0712972-2292-42042032871164/AnsiballZ_stat.py'
Dec 05 06:02:23 compute-0 sudo[154332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:23 compute-0 podman[154298]: 2025-12-05 06:02:23.297576294 +0000 UTC m=+0.062777272 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_id=ovn_controller)
Dec 05 06:02:23 compute-0 python3.9[154342]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:02:23 compute-0 sudo[154332]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:23 compute-0 sudo[154426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcnitacsupyoujrbiqtdzrtvrvvvdpfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914543.0712972-2292-42042032871164/AnsiballZ_file.py'
Dec 05 06:02:23 compute-0 sudo[154426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:23 compute-0 python3.9[154428]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.7wnljzd2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:02:23 compute-0 sudo[154426]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:24 compute-0 sudo[154578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwowljlpucwgutlulnrxvrkpnyqbofja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914543.8561277-2316-215046192390889/AnsiballZ_stat.py'
Dec 05 06:02:24 compute-0 sudo[154578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:24 compute-0 python3.9[154580]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:02:24 compute-0 sudo[154578]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:24 compute-0 sudo[154656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilosvzejaqomieyxjosacelsuwhlfies ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914543.8561277-2316-215046192390889/AnsiballZ_file.py'
Dec 05 06:02:24 compute-0 sudo[154656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:24 compute-0 python3.9[154658]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:02:24 compute-0 sudo[154656]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:24 compute-0 sudo[154808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbacvrneybvznootpebjgnasmihuvixo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914544.7246926-2342-231401558278322/AnsiballZ_command.py'
Dec 05 06:02:24 compute-0 sudo[154808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:25 compute-0 python3.9[154810]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 06:02:25 compute-0 sudo[154808]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:25 compute-0 sudo[154961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uejylepawesruhggehrcwhbhrdccyuxc ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764914545.2239473-2358-99957991712901/AnsiballZ_edpm_nftables_from_files.py'
Dec 05 06:02:25 compute-0 sudo[154961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:25 compute-0 python3[154963]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 05 06:02:25 compute-0 sudo[154961]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:26 compute-0 sudo[155113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkxpqbwoiyqdjxtcppbhaccgtcicuuhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914545.8116102-2374-95658534987043/AnsiballZ_stat.py'
Dec 05 06:02:26 compute-0 sudo[155113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:26 compute-0 python3.9[155115]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:02:26 compute-0 sudo[155113]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:26 compute-0 sudo[155191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkcnydwtbeofchtcgnlrphyssphmjrfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914545.8116102-2374-95658534987043/AnsiballZ_file.py'
Dec 05 06:02:26 compute-0 sudo[155191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:26 compute-0 python3.9[155193]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:02:26 compute-0 sudo[155191]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:26 compute-0 sudo[155343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzrbemyspgqelldxfobnbouusbigbcff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914546.6285203-2398-93247851959544/AnsiballZ_stat.py'
Dec 05 06:02:26 compute-0 sudo[155343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:26 compute-0 python3.9[155345]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:02:27 compute-0 sudo[155343]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:27 compute-0 sudo[155421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmvozthckdmxpqvdlgikliugrplfvbae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914546.6285203-2398-93247851959544/AnsiballZ_file.py'
Dec 05 06:02:27 compute-0 sudo[155421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:27 compute-0 python3.9[155423]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:02:27 compute-0 sudo[155421]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:27 compute-0 sudo[155573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrboohxwhcmdplhukkcfulsggzvnxcat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914547.4958308-2422-162256581813985/AnsiballZ_stat.py'
Dec 05 06:02:27 compute-0 sudo[155573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:27 compute-0 python3.9[155575]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:02:27 compute-0 sudo[155573]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:28 compute-0 sudo[155651]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-echmqkciyyabggolgrfoepdmhoxsdcbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914547.4958308-2422-162256581813985/AnsiballZ_file.py'
Dec 05 06:02:28 compute-0 sudo[155651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:28 compute-0 python3.9[155653]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:02:28 compute-0 sudo[155651]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:28 compute-0 sudo[155803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uodregrztecilmdthgrkwvufekalvfpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914548.3098922-2446-162828383475382/AnsiballZ_stat.py'
Dec 05 06:02:28 compute-0 sudo[155803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:28 compute-0 python3.9[155805]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:02:28 compute-0 sudo[155803]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:28 compute-0 sudo[155881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbsczjkrrtgiavsdnjpbizttigcrnhuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914548.3098922-2446-162828383475382/AnsiballZ_file.py'
Dec 05 06:02:28 compute-0 sudo[155881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:28 compute-0 python3.9[155883]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:02:29 compute-0 sudo[155881]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:29 compute-0 sudo[156033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhjzudhogzuftabsofesbunjiebkaaxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914549.1373756-2470-267678491862264/AnsiballZ_stat.py'
Dec 05 06:02:29 compute-0 sudo[156033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:02:29.481 104041 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:02:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:02:29.481 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:02:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:02:29.481 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:02:29 compute-0 python3.9[156035]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:02:29 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Dec 05 06:02:29 compute-0 sudo[156033]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:29 compute-0 systemd[1]: setroubleshootd.service: Deactivated successfully.
Dec 05 06:02:29 compute-0 sudo[156159]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcvnxazmsgrbcnlusytbbzeuihzmfyvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914549.1373756-2470-267678491862264/AnsiballZ_copy.py'
Dec 05 06:02:29 compute-0 sudo[156159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:29 compute-0 python3.9[156161]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914549.1373756-2470-267678491862264/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:02:29 compute-0 sudo[156159]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:30 compute-0 sudo[156311]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lohkxqjnecaeeowvxzwafnwzdvovkwuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914550.1168523-2500-57779611333048/AnsiballZ_file.py'
Dec 05 06:02:30 compute-0 sudo[156311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:30 compute-0 python3.9[156313]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:02:30 compute-0 sudo[156311]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:30 compute-0 sudo[156463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-suqzajtcmxehnlkljwgfvhzytuqxtsin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914550.6064603-2516-274770595531809/AnsiballZ_command.py'
Dec 05 06:02:30 compute-0 sudo[156463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:30 compute-0 python3.9[156465]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 06:02:30 compute-0 sudo[156463]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:31 compute-0 sudo[156628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nynjpdjpxumuhumvavvymqnhpdnbdbaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914551.0835338-2532-248399340936729/AnsiballZ_blockinfile.py'
Dec 05 06:02:31 compute-0 sudo[156628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:31 compute-0 podman[156592]: 2025-12-05 06:02:31.424362303 +0000 UTC m=+0.042263512 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Dec 05 06:02:31 compute-0 python3.9[156636]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:02:31 compute-0 sudo[156628]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:31 compute-0 sudo[156787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yppxxkmmansjefygqudukfdnbhyvkgue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914551.789818-2550-34349494698753/AnsiballZ_command.py'
Dec 05 06:02:31 compute-0 sudo[156787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:32 compute-0 python3.9[156789]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 06:02:32 compute-0 sudo[156787]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:32 compute-0 sudo[156940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbnkcqlrfakdapbstumvtomdhuylsqmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914552.3089595-2566-49224788728260/AnsiballZ_stat.py'
Dec 05 06:02:32 compute-0 sudo[156940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:32 compute-0 python3.9[156942]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 06:02:32 compute-0 sudo[156940]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:32 compute-0 sudo[157094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elnvabntdwgcimqwhtphsnzuyyzavkxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914552.7694168-2582-170842054489746/AnsiballZ_command.py'
Dec 05 06:02:32 compute-0 sudo[157094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:33 compute-0 python3.9[157096]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 06:02:33 compute-0 sudo[157094]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:33 compute-0 sudo[157249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzqokxzeppjfoortkfyqqzghaegcjgvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914553.260318-2598-249868364227899/AnsiballZ_file.py'
Dec 05 06:02:33 compute-0 sudo[157249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:33 compute-0 python3.9[157251]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:02:33 compute-0 sudo[157249]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:33 compute-0 sudo[157401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxavdhyeszkwzelfkeowjyihxakduypr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914553.7197878-2614-127812139191167/AnsiballZ_stat.py'
Dec 05 06:02:33 compute-0 sudo[157401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:34 compute-0 python3.9[157403]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:02:34 compute-0 sudo[157401]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:34 compute-0 sudo[157524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdhzmrgvpsbdegsvgazdvanukthybxcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914553.7197878-2614-127812139191167/AnsiballZ_copy.py'
Dec 05 06:02:34 compute-0 sudo[157524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:34 compute-0 python3.9[157526]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764914553.7197878-2614-127812139191167/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:02:34 compute-0 sudo[157524]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:34 compute-0 sudo[157676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhuudpjuvkfwmgqobweulwkvqztabiko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914554.581862-2644-186381356378368/AnsiballZ_stat.py'
Dec 05 06:02:34 compute-0 sudo[157676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:34 compute-0 python3.9[157678]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:02:34 compute-0 sudo[157676]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:35 compute-0 sudo[157799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jaildgsgzpbmsfpxvivklepwbinvyqne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914554.581862-2644-186381356378368/AnsiballZ_copy.py'
Dec 05 06:02:35 compute-0 sudo[157799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:35 compute-0 python3.9[157801]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764914554.581862-2644-186381356378368/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:02:35 compute-0 sudo[157799]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:35 compute-0 sudo[157951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hiedokvgmwosthtgeutihhcgvckgimnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914555.4616344-2674-120961388258544/AnsiballZ_stat.py'
Dec 05 06:02:35 compute-0 sudo[157951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:35 compute-0 python3.9[157953]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:02:35 compute-0 sudo[157951]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:36 compute-0 sudo[158074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htvwjypxwbccrnodhoqxlbgpptdyhvfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914555.4616344-2674-120961388258544/AnsiballZ_copy.py'
Dec 05 06:02:36 compute-0 sudo[158074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:36 compute-0 python3.9[158076]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764914555.4616344-2674-120961388258544/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:02:36 compute-0 sudo[158074]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:36 compute-0 sudo[158226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atscuumieznspjbqzszpaikqnkhiqswc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914556.393056-2704-220586221172849/AnsiballZ_systemd.py'
Dec 05 06:02:36 compute-0 sudo[158226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:36 compute-0 python3.9[158228]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 06:02:36 compute-0 systemd[1]: Reloading.
Dec 05 06:02:36 compute-0 systemd-sysv-generator[158252]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 06:02:36 compute-0 systemd-rc-local-generator[158248]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 06:02:37 compute-0 systemd[1]: Reached target edpm_libvirt.target.
Dec 05 06:02:37 compute-0 sudo[158226]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:37 compute-0 sudo[158417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggmnvlsjekbzdmagyswalubhbatcriqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914557.25136-2720-226542384912036/AnsiballZ_systemd.py'
Dec 05 06:02:37 compute-0 sudo[158417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:37 compute-0 python3.9[158419]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 05 06:02:37 compute-0 systemd[1]: Reloading.
Dec 05 06:02:37 compute-0 systemd-sysv-generator[158447]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 06:02:37 compute-0 systemd-rc-local-generator[158444]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 06:02:37 compute-0 systemd[1]: Reloading.
Dec 05 06:02:37 compute-0 systemd-rc-local-generator[158478]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 06:02:37 compute-0 systemd-sysv-generator[158481]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 06:02:38 compute-0 sudo[158417]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:38 compute-0 sshd-session[104161]: Connection closed by 192.168.122.30 port 48320
Dec 05 06:02:38 compute-0 sshd-session[104158]: pam_unix(sshd:session): session closed for user zuul
Dec 05 06:02:38 compute-0 systemd[1]: session-22.scope: Deactivated successfully.
Dec 05 06:02:38 compute-0 systemd[1]: session-22.scope: Consumed 2min 19.822s CPU time.
Dec 05 06:02:38 compute-0 systemd-logind[745]: Session 22 logged out. Waiting for processes to exit.
Dec 05 06:02:38 compute-0 systemd-logind[745]: Removed session 22.
Dec 05 06:02:43 compute-0 sshd-session[158517]: Accepted publickey for zuul from 192.168.122.30 port 33696 ssh2: ECDSA SHA256:qs1gnk6DlsWBMwOXH28ouwL9ltr8rmS6SqK96Fn6Lw8
Dec 05 06:02:43 compute-0 systemd-logind[745]: New session 23 of user zuul.
Dec 05 06:02:43 compute-0 systemd[1]: Started Session 23 of User zuul.
Dec 05 06:02:43 compute-0 sshd-session[158517]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 06:02:43 compute-0 python3.9[158670]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 06:02:44 compute-0 python3.9[158824]: ansible-ansible.builtin.service_facts Invoked
Dec 05 06:02:44 compute-0 network[158841]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 05 06:02:44 compute-0 network[158842]: 'network-scripts' will be removed from distribution in near future.
Dec 05 06:02:44 compute-0 network[158843]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 05 06:02:47 compute-0 sudo[159112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olrwullkuxcgiezggznenshnxmulpuyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914567.029334-74-207343012342211/AnsiballZ_setup.py'
Dec 05 06:02:47 compute-0 sudo[159112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:47 compute-0 python3.9[159114]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 05 06:02:47 compute-0 sudo[159112]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:47 compute-0 sudo[159196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxlzjqegipxgetmwotpulrwjteagakih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914567.029334-74-207343012342211/AnsiballZ_dnf.py'
Dec 05 06:02:47 compute-0 sudo[159196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:48 compute-0 python3.9[159198]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 06:02:52 compute-0 sudo[159196]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:52 compute-0 sudo[159349]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwzozuiuraoatwhdoblhtunqwgsnawch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914572.4309852-98-231930986904262/AnsiballZ_stat.py'
Dec 05 06:02:52 compute-0 sudo[159349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:52 compute-0 python3.9[159351]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 06:02:52 compute-0 sudo[159349]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:53 compute-0 sudo[159501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvidjdavnrmzxfqhptauojfthvlywldq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914573.0444305-118-77939443098202/AnsiballZ_command.py'
Dec 05 06:02:53 compute-0 sudo[159501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:53 compute-0 podman[159503]: 2025-12-05 06:02:53.399922494 +0000 UTC m=+0.059296577 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_id=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.build-date=20251202)
Dec 05 06:02:53 compute-0 python3.9[159504]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 06:02:53 compute-0 sudo[159501]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:53 compute-0 sudo[159677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azkkchgeuoznuxvbgfdiafqujzdalhrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914573.701574-138-26658826100335/AnsiballZ_stat.py'
Dec 05 06:02:53 compute-0 sudo[159677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:54 compute-0 python3.9[159679]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 06:02:54 compute-0 sudo[159677]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:54 compute-0 sudo[159829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-poxossjbjrjhautafldkajnfkeudvaux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914574.1448753-154-185153803215609/AnsiballZ_command.py'
Dec 05 06:02:54 compute-0 sudo[159829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:54 compute-0 python3.9[159831]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 06:02:54 compute-0 sudo[159829]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:54 compute-0 sudo[159982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmkbsgbqmryzhwxxkmtovusxkwpieydo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914574.5967832-170-227162287810030/AnsiballZ_stat.py'
Dec 05 06:02:54 compute-0 sudo[159982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:54 compute-0 python3.9[159984]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:02:54 compute-0 sudo[159982]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:55 compute-0 sudo[160105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sldocujdlzhlyrkcoebmtgzonchxretk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914574.5967832-170-227162287810030/AnsiballZ_copy.py'
Dec 05 06:02:55 compute-0 sudo[160105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:55 compute-0 python3.9[160107]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764914574.5967832-170-227162287810030/.source.iscsi _original_basename=.zix0y6m9 follow=False checksum=88d46f5e4468ca0dbef05ebc9f1616a09e374603 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:02:55 compute-0 sudo[160105]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:55 compute-0 sudo[160257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlxqnujnijyxhquyhtcrfvxpzcrebugy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914575.4910874-200-124541471000500/AnsiballZ_file.py'
Dec 05 06:02:55 compute-0 sudo[160257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:55 compute-0 python3.9[160259]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:02:55 compute-0 sudo[160257]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:56 compute-0 sudo[160409]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtzmvqabfnwwvyopbhnxlutixyzklezl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914576.0648165-216-197685331476612/AnsiballZ_lineinfile.py'
Dec 05 06:02:56 compute-0 sudo[160409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:56 compute-0 python3.9[160411]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:02:56 compute-0 sudo[160409]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:57 compute-0 sudo[160561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umlqlsjsfkoyhgzbatzliomlfkfhanhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914576.6713126-234-248479798959566/AnsiballZ_systemd_service.py'
Dec 05 06:02:57 compute-0 sudo[160561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:57 compute-0 python3.9[160563]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 06:02:57 compute-0 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Dec 05 06:02:57 compute-0 sudo[160561]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:57 compute-0 sudo[160717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzzjgagyktrzywtjcpsaxeuxiuiuqwhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914577.5239465-250-49213131564655/AnsiballZ_systemd_service.py'
Dec 05 06:02:57 compute-0 sudo[160717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:57 compute-0 python3.9[160719]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 06:02:57 compute-0 systemd[1]: Reloading.
Dec 05 06:02:58 compute-0 systemd-sysv-generator[160744]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 06:02:58 compute-0 systemd-rc-local-generator[160741]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 06:02:58 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec 05 06:02:58 compute-0 systemd[1]: Starting Open-iSCSI...
Dec 05 06:02:58 compute-0 kernel: Loading iSCSI transport class v2.0-870.
Dec 05 06:02:58 compute-0 systemd[1]: Started Open-iSCSI.
Dec 05 06:02:58 compute-0 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Dec 05 06:02:58 compute-0 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Dec 05 06:02:58 compute-0 sudo[160717]: pam_unix(sudo:session): session closed for user root
Dec 05 06:02:58 compute-0 sudo[160917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnaatkhzdadjrqugqqbfccwjoilsctsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914578.5366223-272-138564020768872/AnsiballZ_service_facts.py'
Dec 05 06:02:58 compute-0 sudo[160917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:02:58 compute-0 python3.9[160919]: ansible-ansible.builtin.service_facts Invoked
Dec 05 06:02:58 compute-0 network[160936]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 05 06:02:58 compute-0 network[160937]: 'network-scripts' will be removed from distribution in near future.
Dec 05 06:02:58 compute-0 network[160938]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 05 06:03:00 compute-0 sudo[160917]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:01 compute-0 sudo[161207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iketlsekckedkzyvgtrmgjchyvvpmxiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914581.2575095-292-247241339250117/AnsiballZ_file.py'
Dec 05 06:03:01 compute-0 sudo[161207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:01 compute-0 python3.9[161209]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 05 06:03:01 compute-0 sudo[161207]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:02 compute-0 sudo[161367]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzvxjqnqhqnadsynzrbifjznwhvdfiqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914581.7188275-308-91005545406628/AnsiballZ_modprobe.py'
Dec 05 06:03:02 compute-0 sudo[161367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:02 compute-0 podman[161333]: 2025-12-05 06:03:02.056649348 +0000 UTC m=+0.039418109 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Dec 05 06:03:02 compute-0 python3.9[161375]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Dec 05 06:03:02 compute-0 sudo[161367]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:02 compute-0 sudo[161531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqkwqynzzcdxfpqznmydwkyvikjjtswh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914582.349938-324-224676447733097/AnsiballZ_stat.py'
Dec 05 06:03:02 compute-0 sudo[161531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:02 compute-0 python3.9[161533]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:03:02 compute-0 sudo[161531]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:02 compute-0 sudo[161654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzhfdwxuijuroqffjzavqotakomzwoon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914582.349938-324-224676447733097/AnsiballZ_copy.py'
Dec 05 06:03:02 compute-0 sudo[161654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:03 compute-0 python3.9[161656]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764914582.349938-324-224676447733097/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:03:03 compute-0 sudo[161654]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:03 compute-0 sudo[161806]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwrwdhxtjeibiduqpajaylfysulzbhrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914583.2690156-356-150232984714138/AnsiballZ_lineinfile.py'
Dec 05 06:03:03 compute-0 sudo[161806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:03 compute-0 python3.9[161808]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:03:03 compute-0 sudo[161806]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:04 compute-0 sudo[161958]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhsujgtciyolktoafrtrjftghmytnjjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914583.737907-372-176018628488506/AnsiballZ_systemd.py'
Dec 05 06:03:04 compute-0 sudo[161958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:04 compute-0 python3.9[161960]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 06:03:04 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 05 06:03:04 compute-0 systemd[1]: Stopped Load Kernel Modules.
Dec 05 06:03:04 compute-0 systemd[1]: Stopping Load Kernel Modules...
Dec 05 06:03:04 compute-0 systemd[1]: Starting Load Kernel Modules...
Dec 05 06:03:04 compute-0 systemd[1]: Finished Load Kernel Modules.
Dec 05 06:03:04 compute-0 sudo[161958]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:04 compute-0 sudo[162114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgnixzcuienpblgogvpsrkdkpntkrxlv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914584.578876-388-193756010800947/AnsiballZ_file.py'
Dec 05 06:03:04 compute-0 sudo[162114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:04 compute-0 python3.9[162116]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 06:03:04 compute-0 sudo[162114]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:05 compute-0 sudo[162266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-roozcjzgprgjozmwjnynoxzfcothooiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914585.1162992-406-6357869751938/AnsiballZ_stat.py'
Dec 05 06:03:05 compute-0 sudo[162266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:05 compute-0 python3.9[162268]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 06:03:05 compute-0 sudo[162266]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:05 compute-0 sudo[162418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvkuffbkwglzppftxgvowzimkhrxyycm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914585.674083-424-204492143533927/AnsiballZ_stat.py'
Dec 05 06:03:05 compute-0 sudo[162418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:06 compute-0 python3.9[162420]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 06:03:06 compute-0 sudo[162418]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:06 compute-0 sudo[162570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frlccdkzrccitpfrcmsicuiktwhvykwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914586.1290672-440-230674069093875/AnsiballZ_stat.py'
Dec 05 06:03:06 compute-0 sudo[162570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:06 compute-0 python3.9[162572]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:03:06 compute-0 sudo[162570]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:06 compute-0 sudo[162693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibxaxvfidilmjndvyuhfbubpleyigljv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914586.1290672-440-230674069093875/AnsiballZ_copy.py'
Dec 05 06:03:06 compute-0 sudo[162693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:06 compute-0 python3.9[162695]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764914586.1290672-440-230674069093875/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:03:06 compute-0 sudo[162693]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:07 compute-0 sudo[162845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfwrhwwtfcbiymjhiehgufhhfgjaavrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914586.955965-470-40802152633974/AnsiballZ_command.py'
Dec 05 06:03:07 compute-0 sudo[162845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:07 compute-0 python3.9[162847]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 06:03:07 compute-0 sudo[162845]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:07 compute-0 sudo[162998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgtjiixcdwywaoukgrfytqpkhhjaugwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914587.4186823-486-32422168641331/AnsiballZ_lineinfile.py'
Dec 05 06:03:07 compute-0 sudo[162998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:07 compute-0 python3.9[163000]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:03:07 compute-0 sudo[162998]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:08 compute-0 sudo[163150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmarokiyqivlagnejvtfarkyngfqmpsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914587.880762-502-23902479270119/AnsiballZ_replace.py'
Dec 05 06:03:08 compute-0 sudo[163150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:08 compute-0 python3.9[163152]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:03:08 compute-0 sudo[163150]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:08 compute-0 sudo[163302]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkesyjrqofiepltpbfyjmadjdeuxfaqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914588.47197-518-183978054365644/AnsiballZ_replace.py'
Dec 05 06:03:08 compute-0 sudo[163302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:08 compute-0 python3.9[163304]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:03:08 compute-0 sudo[163302]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:09 compute-0 sudo[163454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkbddjfvtamnqsaszjnngwpdjyahpqbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914588.9680507-536-146397261501562/AnsiballZ_lineinfile.py'
Dec 05 06:03:09 compute-0 sudo[163454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:09 compute-0 python3.9[163456]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:03:09 compute-0 sudo[163454]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:09 compute-0 sudo[163606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfygndyetgclvxbboffdgwsobkswusxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914589.4286447-536-143475108603039/AnsiballZ_lineinfile.py'
Dec 05 06:03:09 compute-0 sudo[163606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:09 compute-0 python3.9[163608]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:03:09 compute-0 sudo[163606]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:10 compute-0 sudo[163758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bedoewzvfaajqkgiqvmekvvtlfpwvkdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914589.8517425-536-125071753344595/AnsiballZ_lineinfile.py'
Dec 05 06:03:10 compute-0 sudo[163758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:10 compute-0 python3.9[163760]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:03:10 compute-0 sudo[163758]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:10 compute-0 sudo[163910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pogdiomnfsegdqufoykfmreivcaclcmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914590.2700686-536-22485953567535/AnsiballZ_lineinfile.py'
Dec 05 06:03:10 compute-0 sudo[163910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:10 compute-0 python3.9[163912]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:03:10 compute-0 sudo[163910]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:10 compute-0 sudo[164062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpxbhftjrahetgyhhkvchltofvvciiaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914590.7168624-594-193774343794207/AnsiballZ_stat.py'
Dec 05 06:03:10 compute-0 sudo[164062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:11 compute-0 python3.9[164064]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 06:03:11 compute-0 sudo[164062]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:11 compute-0 sudo[164216]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urolekffyilbxrnxkuvwubettsyeosjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914591.18738-610-231309587312193/AnsiballZ_file.py'
Dec 05 06:03:11 compute-0 sudo[164216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:11 compute-0 python3.9[164218]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:03:11 compute-0 sudo[164216]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:11 compute-0 sudo[164368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnbhxaqtxgqaitplteqijmkbyuwwkmez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914591.7477758-628-185172801700797/AnsiballZ_file.py'
Dec 05 06:03:11 compute-0 sudo[164368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:12 compute-0 python3.9[164370]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 06:03:12 compute-0 sudo[164368]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:12 compute-0 sudo[164520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkhuuuuagywoeaayfolnmigomxcubcpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914592.2146049-644-251329666330562/AnsiballZ_stat.py'
Dec 05 06:03:12 compute-0 sudo[164520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:12 compute-0 python3.9[164522]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:03:12 compute-0 sudo[164520]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:12 compute-0 sudo[164598]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibsqhwblxnpxqyqupmlxwbdtvskmqbhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914592.2146049-644-251329666330562/AnsiballZ_file.py'
Dec 05 06:03:12 compute-0 sudo[164598]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:12 compute-0 python3.9[164600]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 06:03:12 compute-0 sudo[164598]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:13 compute-0 sudo[164750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkckobbldzhqdairuykhmlmetfaucqar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914592.958025-644-150710562910470/AnsiballZ_stat.py'
Dec 05 06:03:13 compute-0 sudo[164750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:13 compute-0 python3.9[164752]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:03:13 compute-0 sudo[164750]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:13 compute-0 sudo[164828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lthrtycorjpotthimcazkhsuyhgtbsis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914592.958025-644-150710562910470/AnsiballZ_file.py'
Dec 05 06:03:13 compute-0 sudo[164828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:13 compute-0 python3.9[164830]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 06:03:13 compute-0 sudo[164828]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:13 compute-0 sudo[164980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fedryixzvorjhaylilexywekrovkufif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914593.7097945-690-116785618192838/AnsiballZ_file.py'
Dec 05 06:03:13 compute-0 sudo[164980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:14 compute-0 python3.9[164982]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:03:14 compute-0 sudo[164980]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:14 compute-0 sudo[165132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zndyofstqhvdvuqidvxjuxpnijcqjsvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914594.1592805-706-252091954979763/AnsiballZ_stat.py'
Dec 05 06:03:14 compute-0 sudo[165132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:14 compute-0 python3.9[165134]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:03:14 compute-0 sudo[165132]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:14 compute-0 sudo[165210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypsfxjpcygzfdboarkajezmsuekgoveg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914594.1592805-706-252091954979763/AnsiballZ_file.py'
Dec 05 06:03:14 compute-0 sudo[165210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:14 compute-0 python3.9[165212]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:03:14 compute-0 sudo[165210]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:15 compute-0 sudo[165362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wytxjocclintmjktmexjemcawykdnffb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914594.962306-730-47891551844531/AnsiballZ_stat.py'
Dec 05 06:03:15 compute-0 sudo[165362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:15 compute-0 python3.9[165364]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:03:15 compute-0 sudo[165362]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:15 compute-0 sudo[165440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygcuebinxoweugbokgptjxypumwdkpcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914594.962306-730-47891551844531/AnsiballZ_file.py'
Dec 05 06:03:15 compute-0 sudo[165440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:15 compute-0 python3.9[165442]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:03:15 compute-0 sudo[165440]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:15 compute-0 sudo[165592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plhdrikuqokvarkcnqkujawjylwqlldn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914595.7342985-754-215020815710290/AnsiballZ_systemd.py'
Dec 05 06:03:15 compute-0 sudo[165592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:16 compute-0 python3.9[165594]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 06:03:16 compute-0 systemd[1]: Reloading.
Dec 05 06:03:16 compute-0 systemd-sysv-generator[165617]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 06:03:16 compute-0 systemd-rc-local-generator[165614]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 06:03:16 compute-0 sudo[165592]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:16 compute-0 sudo[165781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfjikmlxueiwxxkuzphdhyrecdutpffx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914596.5267332-770-174072066875836/AnsiballZ_stat.py'
Dec 05 06:03:16 compute-0 sudo[165781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:16 compute-0 python3.9[165783]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:03:16 compute-0 sudo[165781]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:17 compute-0 sudo[165859]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfrgkrtcalxgnbxnljygifhhqpeldcic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914596.5267332-770-174072066875836/AnsiballZ_file.py'
Dec 05 06:03:17 compute-0 sudo[165859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:17 compute-0 python3.9[165861]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:03:17 compute-0 sudo[165859]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:17 compute-0 sudo[166011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wewrxvdbhcovmudnkuxzslwavtuieowx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914597.294145-794-162003451508662/AnsiballZ_stat.py'
Dec 05 06:03:17 compute-0 sudo[166011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:17 compute-0 python3.9[166013]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:03:17 compute-0 sudo[166011]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:17 compute-0 sudo[166089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrchecheixjsnaeijrnocpnrdhzpznrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914597.294145-794-162003451508662/AnsiballZ_file.py'
Dec 05 06:03:17 compute-0 sudo[166089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:17 compute-0 python3.9[166091]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:03:17 compute-0 sudo[166089]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:18 compute-0 sudo[166241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlmfbzttyoxhcnlgrwwhevfxznztdmzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914598.0664694-818-28766710100474/AnsiballZ_systemd.py'
Dec 05 06:03:18 compute-0 sudo[166241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:18 compute-0 python3.9[166243]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 06:03:18 compute-0 systemd[1]: Reloading.
Dec 05 06:03:18 compute-0 systemd-rc-local-generator[166267]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 06:03:18 compute-0 systemd-sysv-generator[166271]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 06:03:18 compute-0 systemd[1]: Starting Create netns directory...
Dec 05 06:03:18 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 05 06:03:18 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 05 06:03:18 compute-0 systemd[1]: Finished Create netns directory.
Dec 05 06:03:18 compute-0 sudo[166241]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:19 compute-0 sudo[166433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ganlyswtvkdnpoymqjikavxsmpqzlgae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914598.9866862-838-258421444401472/AnsiballZ_file.py'
Dec 05 06:03:19 compute-0 sudo[166433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:19 compute-0 python3.9[166435]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 06:03:19 compute-0 sudo[166433]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:19 compute-0 sudo[166585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqlltgepkvdqythxqrukocaqjuroeasc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914599.4744768-854-113309819328011/AnsiballZ_stat.py'
Dec 05 06:03:19 compute-0 sudo[166585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:19 compute-0 python3.9[166587]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:03:19 compute-0 sudo[166585]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:20 compute-0 sudo[166708]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpzjjqvxwbsggpzyqkjerxqrhnqbymjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914599.4744768-854-113309819328011/AnsiballZ_copy.py'
Dec 05 06:03:20 compute-0 sudo[166708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:20 compute-0 python3.9[166710]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764914599.4744768-854-113309819328011/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 05 06:03:20 compute-0 sudo[166708]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:20 compute-0 sudo[166860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egsmpzlzmkchsckowtpoyvfdyzexfdxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914600.483119-888-114068499641965/AnsiballZ_file.py'
Dec 05 06:03:20 compute-0 sudo[166860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:20 compute-0 python3.9[166862]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 06:03:20 compute-0 sudo[166860]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:21 compute-0 sudo[167012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-penxrwocjwcdbqxkyocurxucvodlwzvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914601.1151748-904-262900818114736/AnsiballZ_stat.py'
Dec 05 06:03:21 compute-0 sudo[167012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:21 compute-0 python3.9[167014]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:03:21 compute-0 sudo[167012]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:21 compute-0 sudo[167135]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcgaurkpwtpkdndzlqlcoytjwtmyikve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914601.1151748-904-262900818114736/AnsiballZ_copy.py'
Dec 05 06:03:21 compute-0 sudo[167135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:21 compute-0 python3.9[167137]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764914601.1151748-904-262900818114736/.source.json _original_basename=.drjg9jxm follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:03:21 compute-0 sudo[167135]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:22 compute-0 sudo[167287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmarszjqlmhrxugjlhnxiadymbshcbap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914602.021021-934-36658089842240/AnsiballZ_file.py'
Dec 05 06:03:22 compute-0 sudo[167287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:22 compute-0 python3.9[167289]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:03:22 compute-0 sudo[167287]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:22 compute-0 sudo[167439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrnjgwljbngzznybfkzfbixhxwlupmno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914602.5251164-950-220681217913142/AnsiballZ_stat.py'
Dec 05 06:03:22 compute-0 sudo[167439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:22 compute-0 sudo[167439]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:23 compute-0 sudo[167562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvbkbuvtslhqnitsldcjhrwyeimiutub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914602.5251164-950-220681217913142/AnsiballZ_copy.py'
Dec 05 06:03:23 compute-0 sudo[167562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:23 compute-0 sudo[167562]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:23 compute-0 sudo[167723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rehmrlavhaoggvarsjpmnvkxczayyume ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914603.528389-984-1464769763604/AnsiballZ_container_config_data.py'
Dec 05 06:03:23 compute-0 sudo[167723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:23 compute-0 podman[167688]: 2025-12-05 06:03:23.867238463 +0000 UTC m=+0.056158645 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_id=ovn_controller, io.buildah.version=1.41.4)
Dec 05 06:03:24 compute-0 python3.9[167730]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Dec 05 06:03:24 compute-0 sudo[167723]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:24 compute-0 sudo[167889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-coqrewrztxlmejoyeommnkluoaqlnxcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914604.200506-1002-279403022559218/AnsiballZ_container_config_hash.py'
Dec 05 06:03:24 compute-0 sudo[167889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:24 compute-0 python3.9[167891]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 05 06:03:24 compute-0 sudo[167889]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:25 compute-0 sudo[168041]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtqsgkfzqwxmrqtdikfljzxdrkysmkoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914604.9057956-1020-245550793877652/AnsiballZ_podman_container_info.py'
Dec 05 06:03:25 compute-0 sudo[168041]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:25 compute-0 python3.9[168043]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 05 06:03:25 compute-0 sudo[168041]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:26 compute-0 sudo[168213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ripqrglrwewuhysvmrxpedxszstubdij ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764914605.9081426-1046-29491895726912/AnsiballZ_edpm_container_manage.py'
Dec 05 06:03:26 compute-0 sudo[168213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:26 compute-0 python3[168215]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 05 06:03:26 compute-0 podman[168244]: 2025-12-05 06:03:26.569913547 +0000 UTC m=+0.028604515 container create 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, container_name=multipathd)
Dec 05 06:03:26 compute-0 podman[168244]: 2025-12-05 06:03:26.554187959 +0000 UTC m=+0.012878938 image pull e33420805289bc187306032371d5d431ac611775aa0ba0a9b90183e961a97dc0 quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current
Dec 05 06:03:26 compute-0 python3[168215]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current
Dec 05 06:03:26 compute-0 sudo[168213]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:26 compute-0 sudo[168421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xydlsqeaaytprogrvivksmgppfbbjusq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914606.7847304-1062-150914219925247/AnsiballZ_stat.py'
Dec 05 06:03:26 compute-0 sudo[168421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:27 compute-0 python3.9[168423]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 06:03:27 compute-0 sudo[168421]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:27 compute-0 sudo[168575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrqcyxibonskgysztiyeangykopoawvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914607.3449938-1080-44036350201789/AnsiballZ_file.py'
Dec 05 06:03:27 compute-0 sudo[168575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:27 compute-0 python3.9[168577]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:03:27 compute-0 sudo[168575]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:27 compute-0 sudo[168651]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpuzcblenwqymftdainagknlqchzguju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914607.3449938-1080-44036350201789/AnsiballZ_stat.py'
Dec 05 06:03:27 compute-0 sudo[168651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:27 compute-0 python3.9[168653]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 06:03:27 compute-0 sudo[168651]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:28 compute-0 sudo[168802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imhydqkhglrfhqukxnjlactccstnsnnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914608.014483-1080-137314292805284/AnsiballZ_copy.py'
Dec 05 06:03:28 compute-0 sudo[168802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:28 compute-0 python3.9[168804]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764914608.014483-1080-137314292805284/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:03:28 compute-0 sudo[168802]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:28 compute-0 sudo[168878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yyldqowthxqemxuqeggvxuoiufevpxjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914608.014483-1080-137314292805284/AnsiballZ_systemd.py'
Dec 05 06:03:28 compute-0 sudo[168878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:28 compute-0 python3.9[168880]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 05 06:03:28 compute-0 systemd[1]: Reloading.
Dec 05 06:03:28 compute-0 systemd-rc-local-generator[168907]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 06:03:28 compute-0 systemd-sysv-generator[168910]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 06:03:29 compute-0 sudo[168878]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:29 compute-0 sudo[168989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipqfuoibdzhjjbhjdowjbfobefoxnaup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914608.014483-1080-137314292805284/AnsiballZ_systemd.py'
Dec 05 06:03:29 compute-0 sudo[168989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:03:29.482 104041 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:03:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:03:29.483 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:03:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:03:29.483 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:03:29 compute-0 python3.9[168991]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 06:03:29 compute-0 systemd[1]: Reloading.
Dec 05 06:03:29 compute-0 systemd-sysv-generator[169018]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 06:03:29 compute-0 systemd-rc-local-generator[169015]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 06:03:29 compute-0 systemd[1]: Starting multipathd container...
Dec 05 06:03:29 compute-0 systemd[1]: Started libcrun container.
Dec 05 06:03:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9da6df5c01b1f77a7a331351318f34cb8f5c662e3294c8a6b8cc3ecd20b71c68/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 05 06:03:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9da6df5c01b1f77a7a331351318f34cb8f5c662e3294c8a6b8cc3ecd20b71c68/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 05 06:03:29 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0.
Dec 05 06:03:29 compute-0 podman[169032]: 2025-12-05 06:03:29.840637596 +0000 UTC m=+0.083948681 container init 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 05 06:03:29 compute-0 multipathd[169044]: + sudo -E kolla_set_configs
Dec 05 06:03:29 compute-0 sudo[169050]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 05 06:03:29 compute-0 sudo[169050]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 05 06:03:29 compute-0 podman[169032]: 2025-12-05 06:03:29.859792663 +0000 UTC m=+0.103103747 container start 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Dec 05 06:03:29 compute-0 podman[169032]: multipathd
Dec 05 06:03:29 compute-0 systemd[1]: Started multipathd container.
Dec 05 06:03:29 compute-0 sudo[168989]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:29 compute-0 multipathd[169044]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 05 06:03:29 compute-0 multipathd[169044]: INFO:__main__:Validating config file
Dec 05 06:03:29 compute-0 multipathd[169044]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 05 06:03:29 compute-0 multipathd[169044]: INFO:__main__:Writing out command to execute
Dec 05 06:03:29 compute-0 sudo[169050]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:29 compute-0 multipathd[169044]: ++ cat /run_command
Dec 05 06:03:29 compute-0 multipathd[169044]: + CMD='/usr/sbin/multipathd -d'
Dec 05 06:03:29 compute-0 multipathd[169044]: + ARGS=
Dec 05 06:03:29 compute-0 multipathd[169044]: + sudo kolla_copy_cacerts
Dec 05 06:03:29 compute-0 sudo[169073]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Dec 05 06:03:29 compute-0 podman[169051]: 2025-12-05 06:03:29.915388958 +0000 UTC m=+0.047022810 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 05 06:03:29 compute-0 sudo[169073]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 05 06:03:29 compute-0 sudo[169073]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:29 compute-0 systemd[1]: 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0-1a23c012cf0d668.service: Main process exited, code=exited, status=1/FAILURE
Dec 05 06:03:29 compute-0 systemd[1]: 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0-1a23c012cf0d668.service: Failed with result 'exit-code'.
Dec 05 06:03:29 compute-0 multipathd[169044]: + [[ ! -n '' ]]
Dec 05 06:03:29 compute-0 multipathd[169044]: + . kolla_extend_start
Dec 05 06:03:29 compute-0 multipathd[169044]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec 05 06:03:29 compute-0 multipathd[169044]: Running command: '/usr/sbin/multipathd -d'
Dec 05 06:03:29 compute-0 multipathd[169044]: + umask 0022
Dec 05 06:03:29 compute-0 multipathd[169044]: + exec /usr/sbin/multipathd -d
Dec 05 06:03:29 compute-0 multipathd[169044]: 2488.506117 | multipathd v0.9.9: start up
Dec 05 06:03:29 compute-0 multipathd[169044]: 2488.511239 | reconfigure: setting up paths and maps
Dec 05 06:03:29 compute-0 multipathd[169044]: 2488.511925 | _check_bindings_file: failed to read header from /etc/multipath/bindings
Dec 05 06:03:29 compute-0 multipathd[169044]: 2488.512361 | updated bindings file /etc/multipath/bindings
Dec 05 06:03:30 compute-0 python3.9[169230]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 06:03:30 compute-0 sudo[169382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exmderykzepwxasfhpzqjwvhbtwauwji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914610.4807744-1152-85531649533077/AnsiballZ_command.py'
Dec 05 06:03:30 compute-0 sudo[169382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:30 compute-0 python3.9[169384]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 06:03:30 compute-0 sudo[169382]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:31 compute-0 sudo[169543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxqswxavfehomwjnvlslbrqtfftobxyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914610.9879742-1168-244222313995892/AnsiballZ_systemd.py'
Dec 05 06:03:31 compute-0 sudo[169543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:31 compute-0 python3.9[169545]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 06:03:31 compute-0 systemd[1]: Stopping multipathd container...
Dec 05 06:03:31 compute-0 multipathd[169044]: 2490.054091 | multipathd: shut down
Dec 05 06:03:31 compute-0 systemd[1]: libpod-836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0.scope: Deactivated successfully.
Dec 05 06:03:31 compute-0 podman[169549]: 2025-12-05 06:03:31.508686581 +0000 UTC m=+0.054606606 container died 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 05 06:03:31 compute-0 systemd[1]: 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0-1a23c012cf0d668.timer: Deactivated successfully.
Dec 05 06:03:31 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0.
Dec 05 06:03:31 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0-userdata-shm.mount: Deactivated successfully.
Dec 05 06:03:31 compute-0 systemd[1]: var-lib-containers-storage-overlay-9da6df5c01b1f77a7a331351318f34cb8f5c662e3294c8a6b8cc3ecd20b71c68-merged.mount: Deactivated successfully.
Dec 05 06:03:31 compute-0 podman[169549]: 2025-12-05 06:03:31.543339409 +0000 UTC m=+0.089259433 container cleanup 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, container_name=multipathd, io.buildah.version=1.41.4, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 05 06:03:31 compute-0 podman[169549]: multipathd
Dec 05 06:03:31 compute-0 podman[169571]: multipathd
Dec 05 06:03:31 compute-0 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Dec 05 06:03:31 compute-0 systemd[1]: Stopped multipathd container.
Dec 05 06:03:31 compute-0 systemd[1]: Starting multipathd container...
Dec 05 06:03:31 compute-0 systemd[1]: Started libcrun container.
Dec 05 06:03:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9da6df5c01b1f77a7a331351318f34cb8f5c662e3294c8a6b8cc3ecd20b71c68/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 05 06:03:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9da6df5c01b1f77a7a331351318f34cb8f5c662e3294c8a6b8cc3ecd20b71c68/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 05 06:03:31 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0.
Dec 05 06:03:31 compute-0 podman[169580]: 2025-12-05 06:03:31.67237006 +0000 UTC m=+0.067350721 container init 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4)
Dec 05 06:03:31 compute-0 multipathd[169592]: + sudo -E kolla_set_configs
Dec 05 06:03:31 compute-0 sudo[169598]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 05 06:03:31 compute-0 sudo[169598]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 05 06:03:31 compute-0 podman[169580]: multipathd
Dec 05 06:03:31 compute-0 podman[169580]: 2025-12-05 06:03:31.69062258 +0000 UTC m=+0.085603220 container start 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 05 06:03:31 compute-0 systemd[1]: Started multipathd container.
Dec 05 06:03:31 compute-0 sudo[169543]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:31 compute-0 multipathd[169592]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 05 06:03:31 compute-0 multipathd[169592]: INFO:__main__:Validating config file
Dec 05 06:03:31 compute-0 multipathd[169592]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 05 06:03:31 compute-0 multipathd[169592]: INFO:__main__:Writing out command to execute
Dec 05 06:03:31 compute-0 sudo[169598]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:31 compute-0 multipathd[169592]: ++ cat /run_command
Dec 05 06:03:31 compute-0 multipathd[169592]: + CMD='/usr/sbin/multipathd -d'
Dec 05 06:03:31 compute-0 multipathd[169592]: + ARGS=
Dec 05 06:03:31 compute-0 multipathd[169592]: + sudo kolla_copy_cacerts
Dec 05 06:03:31 compute-0 podman[169599]: 2025-12-05 06:03:31.747439389 +0000 UTC m=+0.046536446 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec 05 06:03:31 compute-0 sudo[169625]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Dec 05 06:03:31 compute-0 sudo[169625]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 05 06:03:31 compute-0 systemd[1]: 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0-41bc4806df31dab8.service: Main process exited, code=exited, status=1/FAILURE
Dec 05 06:03:31 compute-0 systemd[1]: 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0-41bc4806df31dab8.service: Failed with result 'exit-code'.
Dec 05 06:03:31 compute-0 sudo[169625]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:31 compute-0 multipathd[169592]: + [[ ! -n '' ]]
Dec 05 06:03:31 compute-0 multipathd[169592]: + . kolla_extend_start
Dec 05 06:03:31 compute-0 multipathd[169592]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec 05 06:03:31 compute-0 multipathd[169592]: Running command: '/usr/sbin/multipathd -d'
Dec 05 06:03:31 compute-0 multipathd[169592]: + umask 0022
Dec 05 06:03:31 compute-0 multipathd[169592]: + exec /usr/sbin/multipathd -d
Dec 05 06:03:31 compute-0 multipathd[169592]: 2490.342241 | multipathd v0.9.9: start up
Dec 05 06:03:31 compute-0 multipathd[169592]: 2490.347241 | reconfigure: setting up paths and maps
Dec 05 06:03:32 compute-0 sudo[169779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awhzitmlezbsvsuiwusbfiwdfjpsqdnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914611.8561852-1184-273700620901429/AnsiballZ_file.py'
Dec 05 06:03:32 compute-0 sudo[169779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:32 compute-0 python3.9[169781]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:03:32 compute-0 sudo[169779]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:32 compute-0 podman[169806]: 2025-12-05 06:03:32.453348706 +0000 UTC m=+0.035853214 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec)
Dec 05 06:03:32 compute-0 sudo[169947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhonbybqbfarrgyvzrplgynzwtzecrpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914612.5208254-1208-146392931670649/AnsiballZ_file.py'
Dec 05 06:03:32 compute-0 sudo[169947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:32 compute-0 python3.9[169949]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 05 06:03:32 compute-0 sudo[169947]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:33 compute-0 sudo[170099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgxnikzuqtbfjreqedzvfdmubeqdyqyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914612.9897435-1224-11299541623581/AnsiballZ_modprobe.py'
Dec 05 06:03:33 compute-0 sudo[170099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:33 compute-0 python3.9[170101]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Dec 05 06:03:33 compute-0 kernel: Key type psk registered
Dec 05 06:03:33 compute-0 sudo[170099]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:33 compute-0 sudo[170261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apxzdwqguffvprlmqllosboheamnqoig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914613.5459313-1240-219366060145923/AnsiballZ_stat.py'
Dec 05 06:03:33 compute-0 sudo[170261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:33 compute-0 python3.9[170263]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:03:33 compute-0 sudo[170261]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:34 compute-0 sudo[170384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwwekxxgbpcsfrtnaewisjavgvalrtgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914613.5459313-1240-219366060145923/AnsiballZ_copy.py'
Dec 05 06:03:34 compute-0 sudo[170384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:34 compute-0 python3.9[170386]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764914613.5459313-1240-219366060145923/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:03:34 compute-0 sudo[170384]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:34 compute-0 sudo[170536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xopfucawizuylwsygwnhbsybwsmdkaos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914614.4825134-1272-4325173622878/AnsiballZ_lineinfile.py'
Dec 05 06:03:34 compute-0 sudo[170536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:34 compute-0 python3.9[170538]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:03:34 compute-0 sudo[170536]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:35 compute-0 sudo[170688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcnnyxgtkpxebtwxbeoeajybywdzbrsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914614.93747-1288-186774159338949/AnsiballZ_systemd.py'
Dec 05 06:03:35 compute-0 sudo[170688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:35 compute-0 python3.9[170690]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 06:03:35 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 05 06:03:35 compute-0 systemd[1]: Stopped Load Kernel Modules.
Dec 05 06:03:35 compute-0 systemd[1]: Stopping Load Kernel Modules...
Dec 05 06:03:35 compute-0 systemd[1]: Starting Load Kernel Modules...
Dec 05 06:03:35 compute-0 systemd[1]: Finished Load Kernel Modules.
Dec 05 06:03:35 compute-0 sudo[170688]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:35 compute-0 sudo[170844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ownhfifcnwuedoslbmaiemondlmdwjcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914615.6002934-1304-65684457740556/AnsiballZ_dnf.py'
Dec 05 06:03:35 compute-0 sudo[170844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:35 compute-0 python3.9[170846]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 05 06:03:37 compute-0 systemd[1]: Reloading.
Dec 05 06:03:37 compute-0 systemd-rc-local-generator[170875]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 06:03:37 compute-0 systemd-sysv-generator[170878]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 06:03:38 compute-0 systemd[1]: Reloading.
Dec 05 06:03:38 compute-0 systemd-rc-local-generator[170906]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 06:03:38 compute-0 systemd-sysv-generator[170909]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 06:03:38 compute-0 virtnodedevd[152543]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Dec 05 06:03:38 compute-0 virtnodedevd[152543]: hostname: compute-0
Dec 05 06:03:38 compute-0 virtnodedevd[152543]: nl_recv returned with error: No buffer space available
Dec 05 06:03:38 compute-0 systemd-logind[745]: Watching system buttons on /dev/input/event0 (Power Button)
Dec 05 06:03:38 compute-0 systemd-logind[745]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec 05 06:03:38 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 05 06:03:38 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 05 06:03:38 compute-0 systemd[1]: Reloading.
Dec 05 06:03:38 compute-0 systemd-sysv-generator[171020]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 06:03:38 compute-0 systemd-rc-local-generator[171016]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 06:03:38 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 05 06:03:39 compute-0 sudo[170844]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:39 compute-0 sudo[172286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkprukykuhasmedhgbzzcoxojhvroijv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914619.2273953-1320-148623941672256/AnsiballZ_systemd_service.py'
Dec 05 06:03:39 compute-0 sudo[172286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:39 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 05 06:03:39 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 05 06:03:39 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.059s CPU time.
Dec 05 06:03:39 compute-0 systemd[1]: run-r7f751bdd90c84d62b3d0c264f95a5501.service: Deactivated successfully.
Dec 05 06:03:39 compute-0 python3.9[172304]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 06:03:39 compute-0 systemd[1]: Stopping Open-iSCSI...
Dec 05 06:03:39 compute-0 iscsid[160759]: iscsid shutting down.
Dec 05 06:03:39 compute-0 systemd[1]: iscsid.service: Deactivated successfully.
Dec 05 06:03:39 compute-0 systemd[1]: Stopped Open-iSCSI.
Dec 05 06:03:39 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec 05 06:03:39 compute-0 systemd[1]: Starting Open-iSCSI...
Dec 05 06:03:39 compute-0 systemd[1]: Started Open-iSCSI.
Dec 05 06:03:39 compute-0 sudo[172286]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:40 compute-0 python3.9[172467]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 06:03:40 compute-0 sudo[172621]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wewsvlkywvmhcbpucdybzngiowsxrlly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914620.6614313-1355-238058773850351/AnsiballZ_file.py'
Dec 05 06:03:40 compute-0 sudo[172621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:40 compute-0 python3.9[172623]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:03:41 compute-0 sudo[172621]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:41 compute-0 sudo[172773]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pphgjxfghrqravfajiuimlqsxpfbwbav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914621.3020504-1377-57388340179058/AnsiballZ_systemd_service.py'
Dec 05 06:03:41 compute-0 sudo[172773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:41 compute-0 python3.9[172775]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 05 06:03:41 compute-0 systemd[1]: Reloading.
Dec 05 06:03:41 compute-0 systemd-rc-local-generator[172799]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 06:03:41 compute-0 systemd-sysv-generator[172802]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 06:03:41 compute-0 sudo[172773]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:42 compute-0 python3.9[172960]: ansible-ansible.builtin.service_facts Invoked
Dec 05 06:03:42 compute-0 network[172977]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 05 06:03:42 compute-0 network[172978]: 'network-scripts' will be removed from distribution in near future.
Dec 05 06:03:42 compute-0 network[172979]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 05 06:03:44 compute-0 sudo[173251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axjaaajlayaujezfvhgxhhdccoctdywi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914624.6156096-1415-137283939996117/AnsiballZ_systemd_service.py'
Dec 05 06:03:44 compute-0 sudo[173251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:45 compute-0 python3.9[173253]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 06:03:45 compute-0 sudo[173251]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:45 compute-0 sudo[173404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzbjrmbafhsgmssvjvrrdlfmnkaximhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914625.1520188-1415-98932215767256/AnsiballZ_systemd_service.py'
Dec 05 06:03:45 compute-0 sudo[173404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:45 compute-0 python3.9[173406]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 06:03:45 compute-0 sudo[173404]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:45 compute-0 sudo[173557]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ziugouzqgjnekhrpnjlcmmakulecljxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914625.6708324-1415-164801938960975/AnsiballZ_systemd_service.py'
Dec 05 06:03:45 compute-0 sudo[173557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:46 compute-0 python3.9[173559]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 06:03:46 compute-0 sudo[173557]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:46 compute-0 sudo[173710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eevvvfiaihkcdguoullkyfkxswlqqpji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914626.197836-1415-30796710761913/AnsiballZ_systemd_service.py'
Dec 05 06:03:46 compute-0 sudo[173710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:46 compute-0 python3.9[173712]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 06:03:46 compute-0 sudo[173710]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:46 compute-0 sudo[173863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsglcfzbdurgpzijbbwsnxamxobffgtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914626.7099962-1415-68111253239199/AnsiballZ_systemd_service.py'
Dec 05 06:03:46 compute-0 sudo[173863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:47 compute-0 python3.9[173865]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 06:03:47 compute-0 sudo[173863]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:47 compute-0 sudo[174016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adoztcetmicsokmqfjhnxvbgtrzsblzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914627.2273455-1415-219762849852119/AnsiballZ_systemd_service.py'
Dec 05 06:03:47 compute-0 sudo[174016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:47 compute-0 python3.9[174018]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 06:03:47 compute-0 sudo[174016]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:47 compute-0 sudo[174169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knkwrwaoddpfvgltrsxxtnjqnfcxwill ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914627.7572293-1415-127302398223229/AnsiballZ_systemd_service.py'
Dec 05 06:03:47 compute-0 sudo[174169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:48 compute-0 python3.9[174171]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 06:03:48 compute-0 sudo[174169]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:48 compute-0 sudo[174322]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdhcippdmgsvdbtjdkevuicgazhfqjve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914628.2770996-1415-230753573921305/AnsiballZ_systemd_service.py'
Dec 05 06:03:48 compute-0 sudo[174322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:48 compute-0 python3.9[174324]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 06:03:48 compute-0 sudo[174322]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:49 compute-0 sudo[174475]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gisgyddotznzrxtfcfygrecmbkgnwsoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914629.005806-1533-6666002894099/AnsiballZ_file.py'
Dec 05 06:03:49 compute-0 sudo[174475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:49 compute-0 python3.9[174477]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:03:49 compute-0 sudo[174475]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:49 compute-0 sudo[174627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdeiomiabbziehkvbbgfmtfglwcglijh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914629.5693994-1533-39901184627541/AnsiballZ_file.py'
Dec 05 06:03:49 compute-0 sudo[174627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:49 compute-0 python3.9[174629]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:03:49 compute-0 sudo[174627]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:50 compute-0 sudo[174779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxoczjrssaxuenezscvxkjasygrbitxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914629.9700081-1533-18685248727209/AnsiballZ_file.py'
Dec 05 06:03:50 compute-0 sudo[174779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:50 compute-0 python3.9[174781]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:03:50 compute-0 sudo[174779]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:50 compute-0 sudo[174931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cexalsgndacfypzawpukvwiqspjcobwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914630.3687916-1533-112542964681620/AnsiballZ_file.py'
Dec 05 06:03:50 compute-0 sudo[174931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:50 compute-0 python3.9[174933]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:03:50 compute-0 sudo[174931]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:50 compute-0 sudo[175083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hihzneqyeepiclkwdiatufxkdvnxfrae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914630.7743115-1533-179687652923110/AnsiballZ_file.py'
Dec 05 06:03:50 compute-0 sudo[175083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:51 compute-0 python3.9[175085]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:03:51 compute-0 sudo[175083]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:51 compute-0 sudo[175235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plhoqlixexfatkgejiyjqfjpydxbykfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914631.219826-1533-228874825079697/AnsiballZ_file.py'
Dec 05 06:03:51 compute-0 sudo[175235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:51 compute-0 python3.9[175237]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:03:51 compute-0 sudo[175235]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:51 compute-0 sudo[175387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktivzghyihpgwwzoefmisrwhpwfbowsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914631.6283443-1533-247344721146567/AnsiballZ_file.py'
Dec 05 06:03:51 compute-0 sudo[175387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:51 compute-0 python3.9[175389]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:03:51 compute-0 sudo[175387]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:52 compute-0 sudo[175539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgopylofqmlgrpzsmhzvdpjotbxlglwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914632.05074-1533-244936162923546/AnsiballZ_file.py'
Dec 05 06:03:52 compute-0 sudo[175539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:52 compute-0 python3.9[175541]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:03:52 compute-0 sudo[175539]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:52 compute-0 sudo[175691]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aricehsahtqchwpsaoumjcsyslobkzbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914632.5122843-1647-66699308414392/AnsiballZ_file.py'
Dec 05 06:03:52 compute-0 sudo[175691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:52 compute-0 python3.9[175693]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:03:52 compute-0 sudo[175691]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:53 compute-0 sudo[175843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlrheuxzhxtngfuytcokradcjdoerdjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914632.9232917-1647-219401008052996/AnsiballZ_file.py'
Dec 05 06:03:53 compute-0 sudo[175843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:53 compute-0 python3.9[175845]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:03:53 compute-0 sudo[175843]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:53 compute-0 sudo[175995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozvbymesgxtpsrxsohpmtrilkpfngfeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914633.3238392-1647-166615406158098/AnsiballZ_file.py'
Dec 05 06:03:53 compute-0 sudo[175995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:53 compute-0 python3.9[175997]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:03:53 compute-0 sudo[175995]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:53 compute-0 sudo[176147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahdmqbwmmaocdpotlncjtulxtbzbvslp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914633.734525-1647-222895296990355/AnsiballZ_file.py'
Dec 05 06:03:53 compute-0 sudo[176147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:53 compute-0 podman[176149]: 2025-12-05 06:03:53.968485763 +0000 UTC m=+0.057431786 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.build-date=20251202)
Dec 05 06:03:54 compute-0 python3.9[176150]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:03:54 compute-0 sudo[176147]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:54 compute-0 sudo[176323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbdzoddbdkpdgjoxmjrimmfeqexmzuak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914634.1489396-1647-34725868706587/AnsiballZ_file.py'
Dec 05 06:03:54 compute-0 sudo[176323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:54 compute-0 python3.9[176325]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:03:54 compute-0 sudo[176323]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:54 compute-0 sudo[176475]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-geeoskksrdgniwvnlgxewxrnnanhfarx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914634.6686132-1647-143208139506411/AnsiballZ_file.py'
Dec 05 06:03:54 compute-0 sudo[176475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:54 compute-0 python3.9[176477]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:03:54 compute-0 sudo[176475]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:55 compute-0 sudo[176627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-advkgzevlymwclxwlhhzwuhprogtsori ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914635.0639985-1647-252576632061937/AnsiballZ_file.py'
Dec 05 06:03:55 compute-0 sudo[176627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:55 compute-0 python3.9[176629]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:03:55 compute-0 sudo[176627]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:55 compute-0 sudo[176779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxwzxbkqsjcgilijtqkqqjmurefiahws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914635.471041-1647-36752923013655/AnsiballZ_file.py'
Dec 05 06:03:55 compute-0 sudo[176779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:55 compute-0 python3.9[176781]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:03:55 compute-0 sudo[176779]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:56 compute-0 sudo[176931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajebrcdfxcwbdzzdcfddlnqntkdfrtnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914636.0344832-1763-163577478021664/AnsiballZ_command.py'
Dec 05 06:03:56 compute-0 sudo[176931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:56 compute-0 python3.9[176933]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 06:03:56 compute-0 sudo[176931]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:56 compute-0 python3.9[177085]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 05 06:03:57 compute-0 sudo[177235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwvvzdkzkqkpnunfoivqiyuynhlymonx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914637.13824-1799-45775689099938/AnsiballZ_systemd_service.py'
Dec 05 06:03:57 compute-0 sudo[177235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:57 compute-0 python3.9[177237]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 05 06:03:57 compute-0 systemd[1]: Reloading.
Dec 05 06:03:57 compute-0 systemd-sysv-generator[177264]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 06:03:57 compute-0 systemd-rc-local-generator[177261]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 06:03:57 compute-0 sudo[177235]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:58 compute-0 sudo[177422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yewjmacrzfkmauwfweripvajudhngpxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914637.9101336-1815-253319623888665/AnsiballZ_command.py'
Dec 05 06:03:58 compute-0 sudo[177422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:58 compute-0 python3.9[177424]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 06:03:58 compute-0 sudo[177422]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:58 compute-0 sudo[177575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpvtkudssqunlkvxfirkcplsmalwucfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914638.3373513-1815-180018580171288/AnsiballZ_command.py'
Dec 05 06:03:58 compute-0 sudo[177575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:58 compute-0 python3.9[177577]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 06:03:58 compute-0 sudo[177575]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:58 compute-0 sudo[177728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwdrzckscxxfgcmrvimqoafetusebttw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914638.7602942-1815-253057682609258/AnsiballZ_command.py'
Dec 05 06:03:58 compute-0 sudo[177728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:59 compute-0 python3.9[177730]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 06:03:59 compute-0 sudo[177728]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:59 compute-0 sudo[177881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvqjucxvquxklwauypplambdnnsopcau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914639.306628-1815-18935562519376/AnsiballZ_command.py'
Dec 05 06:03:59 compute-0 sudo[177881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:03:59 compute-0 python3.9[177883]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 06:03:59 compute-0 sudo[177881]: pam_unix(sudo:session): session closed for user root
Dec 05 06:03:59 compute-0 sudo[178034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfhehwteyzjgeepsxrgkyroavelvzztc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914639.730936-1815-180564883757253/AnsiballZ_command.py'
Dec 05 06:03:59 compute-0 sudo[178034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:00 compute-0 python3.9[178036]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 06:04:00 compute-0 sudo[178034]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:00 compute-0 sudo[178187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrfinakptjtwadvfwsucvsqzfhzsvbpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914640.1536553-1815-255161779181120/AnsiballZ_command.py'
Dec 05 06:04:00 compute-0 sudo[178187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:00 compute-0 python3.9[178189]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 06:04:00 compute-0 sudo[178187]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:00 compute-0 sudo[178340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqxruviieyeufjyyaquyspjgpjypducd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914640.5680013-1815-83144303053742/AnsiballZ_command.py'
Dec 05 06:04:00 compute-0 sudo[178340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:00 compute-0 python3.9[178342]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 06:04:00 compute-0 sudo[178340]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:01 compute-0 sudo[178493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-weegvlpfcwbqoeunesplqrsnexagiilm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914640.9919055-1815-207395459998302/AnsiballZ_command.py'
Dec 05 06:04:01 compute-0 sudo[178493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:01 compute-0 python3.9[178495]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 06:04:01 compute-0 sudo[178493]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:02 compute-0 sudo[178655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtpzcniimjdrttcvytpzchgdbtwgnkfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914642.1617362-1958-231311540767674/AnsiballZ_file.py'
Dec 05 06:04:02 compute-0 sudo[178655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:02 compute-0 podman[178620]: 2025-12-05 06:04:02.365546562 +0000 UTC m=+0.051883175 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4)
Dec 05 06:04:02 compute-0 python3.9[178661]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 06:04:02 compute-0 sudo[178655]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:02 compute-0 podman[178665]: 2025-12-05 06:04:02.590508164 +0000 UTC m=+0.064706504 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec 05 06:04:02 compute-0 sudo[178831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxxglwxdxrovkwkvlixobksiianpwtpo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914642.6314995-1958-23432704758003/AnsiballZ_file.py'
Dec 05 06:04:02 compute-0 sudo[178831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:03 compute-0 python3.9[178833]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 06:04:03 compute-0 sudo[178831]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:03 compute-0 sudo[178983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrtanauixlqjrczzrtlhbkdscyrxexkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914643.1591814-1958-164657825742379/AnsiballZ_file.py'
Dec 05 06:04:03 compute-0 sudo[178983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:03 compute-0 python3.9[178985]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 06:04:03 compute-0 sudo[178983]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:03 compute-0 sudo[179135]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkfelkigtccbyzncqnzjpwitovolxvpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914643.6978395-2002-3785264044457/AnsiballZ_file.py'
Dec 05 06:04:03 compute-0 sudo[179135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:04 compute-0 python3.9[179137]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 06:04:04 compute-0 sudo[179135]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:04 compute-0 sudo[179287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igacjfvldrfkwvkuonhlkhkwuegcdowj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914644.1134546-2002-154400862671844/AnsiballZ_file.py'
Dec 05 06:04:04 compute-0 sudo[179287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:04 compute-0 python3.9[179289]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 06:04:04 compute-0 sudo[179287]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:04 compute-0 sudo[179439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysgkxbgbjppydohigyrzzzecvdsakjoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914644.53491-2002-249821596878387/AnsiballZ_file.py'
Dec 05 06:04:04 compute-0 sudo[179439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:04 compute-0 python3.9[179441]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 06:04:04 compute-0 sudo[179439]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:05 compute-0 sudo[179591]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtdagaqcleesixfnumqcomdgxwqhkvid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914644.9576464-2002-177622949356928/AnsiballZ_file.py'
Dec 05 06:04:05 compute-0 sudo[179591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:05 compute-0 python3.9[179593]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 06:04:05 compute-0 sudo[179591]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:05 compute-0 sudo[179743]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lypqxcprgzfscjxitackfavhlgyvxtzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914645.379951-2002-142991329148923/AnsiballZ_file.py'
Dec 05 06:04:05 compute-0 sudo[179743]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:05 compute-0 python3.9[179745]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 05 06:04:05 compute-0 sudo[179743]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:05 compute-0 sudo[179895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhwhcshdhiaunllrnduklbjpodpdomgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914645.8030548-2002-130263608376249/AnsiballZ_file.py'
Dec 05 06:04:05 compute-0 sudo[179895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:06 compute-0 python3.9[179897]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 05 06:04:06 compute-0 sudo[179895]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:06 compute-0 sudo[180047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibztsfzvlojpuxgjftgzpfvvofwwiuwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914646.2433481-2002-52601155513228/AnsiballZ_file.py'
Dec 05 06:04:06 compute-0 sudo[180047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:06 compute-0 python3.9[180049]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 05 06:04:06 compute-0 sudo[180047]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:10 compute-0 sudo[180199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wknxhiqozqdrtmfaeevyyhqxoiuiilmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914650.0318944-2239-18771962096236/AnsiballZ_getent.py'
Dec 05 06:04:10 compute-0 sudo[180199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:10 compute-0 python3.9[180201]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Dec 05 06:04:10 compute-0 sudo[180199]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:10 compute-0 sudo[180352]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrjoshwmmcydbuqpydowusplksaryotm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914650.5983624-2255-68322042623343/AnsiballZ_group.py'
Dec 05 06:04:10 compute-0 sudo[180352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:11 compute-0 python3.9[180354]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 05 06:04:11 compute-0 groupadd[180355]: group added to /etc/group: name=nova, GID=42436
Dec 05 06:04:11 compute-0 groupadd[180355]: group added to /etc/gshadow: name=nova
Dec 05 06:04:11 compute-0 groupadd[180355]: new group: name=nova, GID=42436
Dec 05 06:04:11 compute-0 sudo[180352]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:11 compute-0 sudo[180510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eoqtipfsjbuwfraujjianqoppbwawukz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914651.2032216-2271-63383352893091/AnsiballZ_user.py'
Dec 05 06:04:11 compute-0 sudo[180510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:11 compute-0 python3.9[180512]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 05 06:04:11 compute-0 useradd[180514]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Dec 05 06:04:11 compute-0 useradd[180514]: add 'nova' to group 'libvirt'
Dec 05 06:04:11 compute-0 useradd[180514]: add 'nova' to shadow group 'libvirt'
Dec 05 06:04:11 compute-0 sudo[180510]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:12 compute-0 sshd-session[180545]: Accepted publickey for zuul from 192.168.122.30 port 47126 ssh2: ECDSA SHA256:qs1gnk6DlsWBMwOXH28ouwL9ltr8rmS6SqK96Fn6Lw8
Dec 05 06:04:12 compute-0 systemd-logind[745]: New session 24 of user zuul.
Dec 05 06:04:12 compute-0 systemd[1]: Started Session 24 of User zuul.
Dec 05 06:04:12 compute-0 sshd-session[180545]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 06:04:12 compute-0 sshd-session[180548]: Received disconnect from 192.168.122.30 port 47126:11: disconnected by user
Dec 05 06:04:12 compute-0 sshd-session[180548]: Disconnected from user zuul 192.168.122.30 port 47126
Dec 05 06:04:12 compute-0 sshd-session[180545]: pam_unix(sshd:session): session closed for user zuul
Dec 05 06:04:12 compute-0 systemd[1]: session-24.scope: Deactivated successfully.
Dec 05 06:04:12 compute-0 systemd-logind[745]: Session 24 logged out. Waiting for processes to exit.
Dec 05 06:04:12 compute-0 systemd-logind[745]: Removed session 24.
Dec 05 06:04:13 compute-0 python3.9[180698]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:04:13 compute-0 python3.9[180819]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764914652.7150676-2321-1348124756914/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 06:04:13 compute-0 python3.9[180969]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:04:14 compute-0 python3.9[181045]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 06:04:14 compute-0 python3.9[181195]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:04:14 compute-0 python3.9[181316]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764914654.1946607-2321-17319426092392/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 06:04:15 compute-0 python3.9[181466]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:04:15 compute-0 python3.9[181587]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764914654.9814165-2321-121122101752424/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 06:04:16 compute-0 python3.9[181737]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:04:16 compute-0 python3.9[181858]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764914655.7612743-2321-44718101721351/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 06:04:16 compute-0 python3.9[182008]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:04:16 compute-0 systemd[1]: virtnodedevd.service: Deactivated successfully.
Dec 05 06:04:17 compute-0 python3.9[182130]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764914656.5409403-2321-77426781520980/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 06:04:17 compute-0 sudo[182280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irterpqbaifcknxgxxwtbfdxiekjpxba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914657.4980557-2487-79750444667010/AnsiballZ_file.py'
Dec 05 06:04:17 compute-0 sudo[182280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:17 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec 05 06:04:17 compute-0 python3.9[182282]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:04:17 compute-0 sudo[182280]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:18 compute-0 sudo[182433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdjcdttdskvbucqgsqkwhgcertrrczaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914657.9812634-2503-176641957003453/AnsiballZ_copy.py'
Dec 05 06:04:18 compute-0 sudo[182433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:18 compute-0 python3.9[182435]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:04:18 compute-0 sudo[182433]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:18 compute-0 systemd[1]: virtqemud.service: Deactivated successfully.
Dec 05 06:04:18 compute-0 sudo[182586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prttmwwefqhhtzcejqnfrzofodsmadom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914658.4721289-2519-222946790377964/AnsiballZ_stat.py'
Dec 05 06:04:18 compute-0 sudo[182586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:18 compute-0 python3.9[182588]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 06:04:18 compute-0 sudo[182586]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:19 compute-0 sudo[182738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzorypwdrzbzyeaashehzjgyfrhskwby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914658.9446278-2535-166600122252454/AnsiballZ_stat.py'
Dec 05 06:04:19 compute-0 sudo[182738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:19 compute-0 python3.9[182740]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:04:19 compute-0 sudo[182738]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:19 compute-0 systemd[1]: virtsecretd.service: Deactivated successfully.
Dec 05 06:04:19 compute-0 sudo[182862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgtdrmmfrujoenstrblrbritfmajroks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914658.9446278-2535-166600122252454/AnsiballZ_copy.py'
Dec 05 06:04:19 compute-0 sudo[182862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:19 compute-0 python3.9[182864]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1764914658.9446278-2535-166600122252454/.source _original_basename=.uw_34dt8 follow=False checksum=235590726ca6138bde7f57271e0a270b51a11402 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Dec 05 06:04:19 compute-0 sudo[182862]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:20 compute-0 python3.9[183016]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 06:04:20 compute-0 python3.9[183168]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:04:21 compute-0 python3.9[183289]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764914660.4763358-2587-197646545034725/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=0c8a4148dad09ff77937f9ff6c2786e98772229f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 06:04:21 compute-0 python3.9[183439]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:04:22 compute-0 python3.9[183560]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764914661.3359537-2617-158176721191339/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=0a49686d4a9232e73d6e1c3f99156f168a41e10f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 06:04:22 compute-0 sudo[183710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrzbozpkcizestcwwqvboqdrdiyrikoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914662.275348-2651-74139998737464/AnsiballZ_container_config_data.py'
Dec 05 06:04:22 compute-0 sudo[183710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:22 compute-0 python3.9[183712]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Dec 05 06:04:22 compute-0 sudo[183710]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:23 compute-0 sudo[183862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uevafxnajliuvozlocivikpkazrqdoyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914662.8376315-2669-148330346377774/AnsiballZ_container_config_hash.py'
Dec 05 06:04:23 compute-0 sudo[183862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:23 compute-0 python3.9[183864]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 05 06:04:23 compute-0 sudo[183862]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:23 compute-0 sudo[184014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqhbfdanknnjyrsukpcgfpfdhlwpdbuk ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764914663.4448695-2689-30592064460891/AnsiballZ_edpm_container_manage.py'
Dec 05 06:04:23 compute-0 sudo[184014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:23 compute-0 python3[184016]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Dec 05 06:04:23 compute-0 podman[184044]: 2025-12-05 06:04:23.958870209 +0000 UTC m=+0.027290346 container create a8ef45ef64c498c54881bdb2946b0a1fc50370d59197b36e8577067e7297cd27 (image=quay.rdoproject.org/podified-master-centos10/openstack-nova-compute:current, name=nova_compute_init, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-nova-compute:current', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=edpm, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Dec 05 06:04:23 compute-0 podman[184044]: 2025-12-05 06:04:23.946476117 +0000 UTC m=+0.014896265 image pull b8877984ba66cc23b3665a3bbc064555c69577db52335d87bee5ea0e0b4830bc quay.rdoproject.org/podified-master-centos10/openstack-nova-compute:current
Dec 05 06:04:23 compute-0 python3[184016]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-nova-compute:current', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.rdoproject.org/podified-master-centos10/openstack-nova-compute:current bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Dec 05 06:04:24 compute-0 sudo[184014]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:24 compute-0 sudo[184230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikekjpyfloqheafsvveyzqauuutfagkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914664.164348-2705-147691673119151/AnsiballZ_stat.py'
Dec 05 06:04:24 compute-0 sudo[184230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:24 compute-0 podman[184195]: 2025-12-05 06:04:24.377692426 +0000 UTC m=+0.063382016 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 05 06:04:24 compute-0 python3.9[184237]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 06:04:24 compute-0 sudo[184230]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:25 compute-0 sudo[184398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbfvkzytazhctwcisseojtnouxhvoyho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914664.8710816-2729-223029238028398/AnsiballZ_container_config_data.py'
Dec 05 06:04:25 compute-0 sudo[184398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:25 compute-0 python3.9[184400]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Dec 05 06:04:25 compute-0 sudo[184398]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:25 compute-0 sudo[184550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzjurulzkhbgawagffuxnxzzmvqlktez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914665.3875167-2747-264359277654116/AnsiballZ_container_config_hash.py'
Dec 05 06:04:25 compute-0 sudo[184550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:25 compute-0 python3.9[184552]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 05 06:04:25 compute-0 sudo[184550]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:26 compute-0 sudo[184702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xntbuagyfrjvvbbznfnwqzkdnumfpnmg ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764914665.9592295-2767-48029111372078/AnsiballZ_edpm_container_manage.py'
Dec 05 06:04:26 compute-0 sudo[184702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:26 compute-0 python3[184704]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Dec 05 06:04:26 compute-0 podman[184734]: 2025-12-05 06:04:26.458723361 +0000 UTC m=+0.027421084 container create b158f98172b51bae5faadce7ee10eeeef59369cf61de1b57aef88ffe67ebdd9d (image=quay.rdoproject.org/podified-master-centos10/openstack-nova-compute:current, name=nova_compute, org.label-schema.name=CentOS Stream 10 Base Image, config_id=edpm, org.label-schema.build-date=20251202, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-nova-compute:current', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec)
Dec 05 06:04:26 compute-0 podman[184734]: 2025-12-05 06:04:26.445259649 +0000 UTC m=+0.013957391 image pull b8877984ba66cc23b3665a3bbc064555c69577db52335d87bee5ea0e0b4830bc quay.rdoproject.org/podified-master-centos10/openstack-nova-compute:current
Dec 05 06:04:26 compute-0 python3[184704]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-nova-compute:current', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.rdoproject.org/podified-master-centos10/openstack-nova-compute:current kolla_start
Dec 05 06:04:26 compute-0 sudo[184702]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:26 compute-0 sudo[184911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-recogmdlqaehmkotofhrnlxvqtaqdlnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914666.6838582-2783-137618016071810/AnsiballZ_stat.py'
Dec 05 06:04:26 compute-0 sudo[184911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:26 compute-0 python3.9[184913]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 06:04:27 compute-0 sudo[184911]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:27 compute-0 sudo[185065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxwonsgxreewscpopotbhuysaufzepku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914667.2116516-2801-157191652682732/AnsiballZ_file.py'
Dec 05 06:04:27 compute-0 sudo[185065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:27 compute-0 python3.9[185067]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:04:27 compute-0 sudo[185065]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:27 compute-0 sudo[185216]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wklrqsnmyovyuzthuazucfdmhrhnlvqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914667.5843894-2801-9346853914400/AnsiballZ_copy.py'
Dec 05 06:04:27 compute-0 sudo[185216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:27 compute-0 python3.9[185218]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764914667.5843894-2801-9346853914400/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:04:27 compute-0 sudo[185216]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:28 compute-0 sudo[185292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srkobefsdzxemrfploxbpmqotdcwvowq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914667.5843894-2801-9346853914400/AnsiballZ_systemd.py'
Dec 05 06:04:28 compute-0 sudo[185292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:28 compute-0 python3.9[185294]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 05 06:04:28 compute-0 systemd[1]: Reloading.
Dec 05 06:04:28 compute-0 systemd-rc-local-generator[185316]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 06:04:28 compute-0 systemd-sysv-generator[185319]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 06:04:28 compute-0 sudo[185292]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:28 compute-0 sudo[185403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fygdfgsoxrudriuytxzqbcllopytzzyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914667.5843894-2801-9346853914400/AnsiballZ_systemd.py'
Dec 05 06:04:28 compute-0 sudo[185403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:28 compute-0 python3.9[185405]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 06:04:29 compute-0 systemd[1]: Reloading.
Dec 05 06:04:29 compute-0 systemd-rc-local-generator[185428]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 06:04:29 compute-0 systemd-sysv-generator[185432]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 06:04:29 compute-0 systemd[1]: Starting nova_compute container...
Dec 05 06:04:29 compute-0 systemd[1]: Started libcrun container.
Dec 05 06:04:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd12aa21a17b981f99a00a9a870f1c28419d6e632e46be4bbeb6d022d18e7a17/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec 05 06:04:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd12aa21a17b981f99a00a9a870f1c28419d6e632e46be4bbeb6d022d18e7a17/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 05 06:04:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd12aa21a17b981f99a00a9a870f1c28419d6e632e46be4bbeb6d022d18e7a17/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 05 06:04:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd12aa21a17b981f99a00a9a870f1c28419d6e632e46be4bbeb6d022d18e7a17/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 05 06:04:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd12aa21a17b981f99a00a9a870f1c28419d6e632e46be4bbeb6d022d18e7a17/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 05 06:04:29 compute-0 podman[185445]: 2025-12-05 06:04:29.327314987 +0000 UTC m=+0.072654040 container init b158f98172b51bae5faadce7ee10eeeef59369cf61de1b57aef88ffe67ebdd9d (image=quay.rdoproject.org/podified-master-centos10/openstack-nova-compute:current, name=nova_compute, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-nova-compute:current', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 05 06:04:29 compute-0 podman[185445]: 2025-12-05 06:04:29.332951068 +0000 UTC m=+0.078290102 container start b158f98172b51bae5faadce7ee10eeeef59369cf61de1b57aef88ffe67ebdd9d (image=quay.rdoproject.org/podified-master-centos10/openstack-nova-compute:current, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-nova-compute:current', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, container_name=nova_compute, org.label-schema.name=CentOS Stream 10 Base Image, config_id=edpm)
Dec 05 06:04:29 compute-0 podman[185445]: nova_compute
Dec 05 06:04:29 compute-0 nova_compute[185457]: + sudo -E kolla_set_configs
Dec 05 06:04:29 compute-0 systemd[1]: Started nova_compute container.
Dec 05 06:04:29 compute-0 sudo[185403]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:29 compute-0 nova_compute[185457]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 05 06:04:29 compute-0 nova_compute[185457]: INFO:__main__:Validating config file
Dec 05 06:04:29 compute-0 nova_compute[185457]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 05 06:04:29 compute-0 nova_compute[185457]: INFO:__main__:Copying service configuration files
Dec 05 06:04:29 compute-0 nova_compute[185457]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec 05 06:04:29 compute-0 nova_compute[185457]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec 05 06:04:29 compute-0 nova_compute[185457]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec 05 06:04:29 compute-0 nova_compute[185457]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec 05 06:04:29 compute-0 nova_compute[185457]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec 05 06:04:29 compute-0 nova_compute[185457]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 05 06:04:29 compute-0 nova_compute[185457]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 05 06:04:29 compute-0 nova_compute[185457]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec 05 06:04:29 compute-0 nova_compute[185457]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec 05 06:04:29 compute-0 nova_compute[185457]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 05 06:04:29 compute-0 nova_compute[185457]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 05 06:04:29 compute-0 nova_compute[185457]: INFO:__main__:Deleting /etc/ceph
Dec 05 06:04:29 compute-0 nova_compute[185457]: INFO:__main__:Creating directory /etc/ceph
Dec 05 06:04:29 compute-0 nova_compute[185457]: INFO:__main__:Setting permission for /etc/ceph
Dec 05 06:04:29 compute-0 nova_compute[185457]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec 05 06:04:29 compute-0 nova_compute[185457]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 05 06:04:29 compute-0 nova_compute[185457]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec 05 06:04:29 compute-0 nova_compute[185457]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 05 06:04:29 compute-0 nova_compute[185457]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec 05 06:04:29 compute-0 nova_compute[185457]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec 05 06:04:29 compute-0 nova_compute[185457]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec 05 06:04:29 compute-0 nova_compute[185457]: INFO:__main__:Writing out command to execute
Dec 05 06:04:29 compute-0 nova_compute[185457]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec 05 06:04:29 compute-0 nova_compute[185457]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 05 06:04:29 compute-0 nova_compute[185457]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 05 06:04:29 compute-0 nova_compute[185457]: ++ cat /run_command
Dec 05 06:04:29 compute-0 nova_compute[185457]: + CMD=nova-compute
Dec 05 06:04:29 compute-0 nova_compute[185457]: + ARGS=
Dec 05 06:04:29 compute-0 nova_compute[185457]: + sudo kolla_copy_cacerts
Dec 05 06:04:29 compute-0 nova_compute[185457]: + [[ ! -n '' ]]
Dec 05 06:04:29 compute-0 nova_compute[185457]: + . kolla_extend_start
Dec 05 06:04:29 compute-0 nova_compute[185457]: Running command: 'nova-compute'
Dec 05 06:04:29 compute-0 nova_compute[185457]: + echo 'Running command: '\''nova-compute'\'''
Dec 05 06:04:29 compute-0 nova_compute[185457]: + umask 0022
Dec 05 06:04:29 compute-0 nova_compute[185457]: + exec nova-compute
Dec 05 06:04:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:04:29.483 104041 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:04:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:04:29.484 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:04:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:04:29.484 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:04:30 compute-0 python3.9[185619]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 06:04:30 compute-0 python3.9[185769]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 06:04:31 compute-0 nova_compute[185457]: 2025-12-05 06:04:31.084 185461 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Dec 05 06:04:31 compute-0 nova_compute[185457]: 2025-12-05 06:04:31.084 185461 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Dec 05 06:04:31 compute-0 nova_compute[185457]: 2025-12-05 06:04:31.084 185461 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Dec 05 06:04:31 compute-0 nova_compute[185457]: 2025-12-05 06:04:31.084 185461 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Dec 05 06:04:31 compute-0 python3.9[185919]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 06:04:31 compute-0 nova_compute[185457]: 2025-12-05 06:04:31.169 185461 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:04:31 compute-0 nova_compute[185457]: 2025-12-05 06:04:31.179 185461 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.009s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:04:31 compute-0 nova_compute[185457]: 2025-12-05 06:04:31.179 185461 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:423
Dec 05 06:04:31 compute-0 nova_compute[185457]: 2025-12-05 06:04:31.207 185461 INFO oslo_service.periodic_task [-] Skipping periodic task _heal_instance_info_cache because its interval is negative
Dec 05 06:04:31 compute-0 nova_compute[185457]: 2025-12-05 06:04:31.208 185461 WARNING oslo_config.cfg [None req-922f4edc-fa96-4ab7-9d07-62db57f45a00 - - - - - -] Deprecated: Option "heartbeat_in_pthread" from group "oslo_messaging_rabbit" is deprecated for removal (The option is related to Eventlet which will be removed. In addition this has never worked as expected with services using eventlet for core service framework.).  Its value may be silently ignored in the future.
Dec 05 06:04:31 compute-0 sudo[186072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aktylyohhqqgmxvbsbegpsqzwtleposi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914671.3586485-2921-126436860123174/AnsiballZ_podman_container.py'
Dec 05 06:04:31 compute-0 sudo[186072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:31 compute-0 python3.9[186074]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec 05 06:04:31 compute-0 sudo[186072]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:31 compute-0 rsyslogd[961]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 05 06:04:32 compute-0 nova_compute[185457]: 2025-12-05 06:04:32.129 185461 INFO nova.virt.driver [None req-922f4edc-fa96-4ab7-9d07-62db57f45a00 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Dec 05 06:04:32 compute-0 nova_compute[185457]: 2025-12-05 06:04:32.200 185461 INFO nova.compute.provider_config [None req-922f4edc-fa96-4ab7-9d07-62db57f45a00 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Dec 05 06:04:32 compute-0 sudo[186245]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inngytgyyztjutnlpmjgbagozwpnrgrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914672.11998-2937-174133045505967/AnsiballZ_systemd.py'
Dec 05 06:04:32 compute-0 sudo[186245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:32 compute-0 podman[186248]: 2025-12-05 06:04:32.461245986 +0000 UTC m=+0.043868554 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251202, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4)
Dec 05 06:04:32 compute-0 python3.9[186247]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 06:04:32 compute-0 systemd[1]: Stopping nova_compute container...
Dec 05 06:04:32 compute-0 podman[186268]: 2025-12-05 06:04:32.660400568 +0000 UTC m=+0.040749362 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Dec 05 06:04:32 compute-0 systemd[1]: libpod-b158f98172b51bae5faadce7ee10eeeef59369cf61de1b57aef88ffe67ebdd9d.scope: Deactivated successfully.
Dec 05 06:04:32 compute-0 systemd[1]: libpod-b158f98172b51bae5faadce7ee10eeeef59369cf61de1b57aef88ffe67ebdd9d.scope: Consumed 2.000s CPU time.
Dec 05 06:04:32 compute-0 podman[186269]: 2025-12-05 06:04:32.669503215 +0000 UTC m=+0.047267561 container died b158f98172b51bae5faadce7ee10eeeef59369cf61de1b57aef88ffe67ebdd9d (image=quay.rdoproject.org/podified-master-centos10/openstack-nova-compute:current, name=nova_compute, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-nova-compute:current', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec)
Dec 05 06:04:32 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b158f98172b51bae5faadce7ee10eeeef59369cf61de1b57aef88ffe67ebdd9d-userdata-shm.mount: Deactivated successfully.
Dec 05 06:04:32 compute-0 systemd[1]: var-lib-containers-storage-overlay-dd12aa21a17b981f99a00a9a870f1c28419d6e632e46be4bbeb6d022d18e7a17-merged.mount: Deactivated successfully.
Dec 05 06:04:32 compute-0 podman[186269]: 2025-12-05 06:04:32.699105747 +0000 UTC m=+0.076870093 container cleanup b158f98172b51bae5faadce7ee10eeeef59369cf61de1b57aef88ffe67ebdd9d (image=quay.rdoproject.org/podified-master-centos10/openstack-nova-compute:current, name=nova_compute, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-nova-compute:current', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Dec 05 06:04:32 compute-0 podman[186269]: nova_compute
Dec 05 06:04:32 compute-0 podman[186308]: nova_compute
Dec 05 06:04:32 compute-0 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Dec 05 06:04:32 compute-0 systemd[1]: Stopped nova_compute container.
Dec 05 06:04:32 compute-0 systemd[1]: Starting nova_compute container...
Dec 05 06:04:32 compute-0 systemd[1]: Started libcrun container.
Dec 05 06:04:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd12aa21a17b981f99a00a9a870f1c28419d6e632e46be4bbeb6d022d18e7a17/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec 05 06:04:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd12aa21a17b981f99a00a9a870f1c28419d6e632e46be4bbeb6d022d18e7a17/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 05 06:04:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd12aa21a17b981f99a00a9a870f1c28419d6e632e46be4bbeb6d022d18e7a17/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 05 06:04:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd12aa21a17b981f99a00a9a870f1c28419d6e632e46be4bbeb6d022d18e7a17/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 05 06:04:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd12aa21a17b981f99a00a9a870f1c28419d6e632e46be4bbeb6d022d18e7a17/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 05 06:04:32 compute-0 podman[186317]: 2025-12-05 06:04:32.822302742 +0000 UTC m=+0.061975271 container init b158f98172b51bae5faadce7ee10eeeef59369cf61de1b57aef88ffe67ebdd9d (image=quay.rdoproject.org/podified-master-centos10/openstack-nova-compute:current, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-nova-compute:current', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, io.buildah.version=1.41.4, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 05 06:04:32 compute-0 podman[186317]: 2025-12-05 06:04:32.826622331 +0000 UTC m=+0.066294859 container start b158f98172b51bae5faadce7ee10eeeef59369cf61de1b57aef88ffe67ebdd9d (image=quay.rdoproject.org/podified-master-centos10/openstack-nova-compute:current, name=nova_compute, config_id=edpm, tcib_managed=true, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-nova-compute:current', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.4, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Dec 05 06:04:32 compute-0 podman[186317]: nova_compute
Dec 05 06:04:32 compute-0 nova_compute[186329]: + sudo -E kolla_set_configs
Dec 05 06:04:32 compute-0 systemd[1]: Started nova_compute container.
Dec 05 06:04:32 compute-0 sudo[186245]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:32 compute-0 nova_compute[186329]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 05 06:04:32 compute-0 nova_compute[186329]: INFO:__main__:Validating config file
Dec 05 06:04:32 compute-0 nova_compute[186329]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 05 06:04:32 compute-0 nova_compute[186329]: INFO:__main__:Copying service configuration files
Dec 05 06:04:32 compute-0 nova_compute[186329]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec 05 06:04:32 compute-0 nova_compute[186329]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec 05 06:04:32 compute-0 nova_compute[186329]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec 05 06:04:32 compute-0 nova_compute[186329]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Dec 05 06:04:32 compute-0 nova_compute[186329]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec 05 06:04:32 compute-0 nova_compute[186329]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec 05 06:04:32 compute-0 nova_compute[186329]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 05 06:04:32 compute-0 nova_compute[186329]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 05 06:04:32 compute-0 nova_compute[186329]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 05 06:04:32 compute-0 nova_compute[186329]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Dec 05 06:04:32 compute-0 nova_compute[186329]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec 05 06:04:32 compute-0 nova_compute[186329]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec 05 06:04:32 compute-0 nova_compute[186329]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 05 06:04:32 compute-0 nova_compute[186329]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 05 06:04:32 compute-0 nova_compute[186329]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 05 06:04:32 compute-0 nova_compute[186329]: INFO:__main__:Deleting /etc/ceph
Dec 05 06:04:32 compute-0 nova_compute[186329]: INFO:__main__:Creating directory /etc/ceph
Dec 05 06:04:32 compute-0 nova_compute[186329]: INFO:__main__:Setting permission for /etc/ceph
Dec 05 06:04:32 compute-0 nova_compute[186329]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Dec 05 06:04:32 compute-0 nova_compute[186329]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec 05 06:04:32 compute-0 nova_compute[186329]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 05 06:04:32 compute-0 nova_compute[186329]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Dec 05 06:04:32 compute-0 nova_compute[186329]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec 05 06:04:32 compute-0 nova_compute[186329]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 05 06:04:32 compute-0 nova_compute[186329]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec 05 06:04:32 compute-0 nova_compute[186329]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec 05 06:04:32 compute-0 nova_compute[186329]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec 05 06:04:32 compute-0 nova_compute[186329]: INFO:__main__:Writing out command to execute
Dec 05 06:04:32 compute-0 nova_compute[186329]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec 05 06:04:32 compute-0 nova_compute[186329]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 05 06:04:32 compute-0 nova_compute[186329]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 05 06:04:32 compute-0 nova_compute[186329]: ++ cat /run_command
Dec 05 06:04:32 compute-0 nova_compute[186329]: + CMD=nova-compute
Dec 05 06:04:32 compute-0 nova_compute[186329]: + ARGS=
Dec 05 06:04:32 compute-0 nova_compute[186329]: + sudo kolla_copy_cacerts
Dec 05 06:04:32 compute-0 nova_compute[186329]: Running command: 'nova-compute'
Dec 05 06:04:32 compute-0 nova_compute[186329]: + [[ ! -n '' ]]
Dec 05 06:04:32 compute-0 nova_compute[186329]: + . kolla_extend_start
Dec 05 06:04:32 compute-0 nova_compute[186329]: + echo 'Running command: '\''nova-compute'\'''
Dec 05 06:04:32 compute-0 nova_compute[186329]: + umask 0022
Dec 05 06:04:32 compute-0 nova_compute[186329]: + exec nova-compute
Dec 05 06:04:33 compute-0 sudo[186490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nafcyfwsltzhcgdccrtzqhicabfgosav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914673.0136316-2955-145871655926519/AnsiballZ_podman_container.py'
Dec 05 06:04:33 compute-0 sudo[186490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:33 compute-0 python3.9[186492]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec 05 06:04:33 compute-0 systemd[1]: Started libpod-conmon-a8ef45ef64c498c54881bdb2946b0a1fc50370d59197b36e8577067e7297cd27.scope.
Dec 05 06:04:33 compute-0 systemd[1]: Started libcrun container.
Dec 05 06:04:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d16eb72644414fee47fd6914b8b5a873312e849409f713b8a5ccc6ec8fe6e33c/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Dec 05 06:04:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d16eb72644414fee47fd6914b8b5a873312e849409f713b8a5ccc6ec8fe6e33c/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 05 06:04:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d16eb72644414fee47fd6914b8b5a873312e849409f713b8a5ccc6ec8fe6e33c/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Dec 05 06:04:33 compute-0 podman[186508]: 2025-12-05 06:04:33.563010189 +0000 UTC m=+0.095941275 container init a8ef45ef64c498c54881bdb2946b0a1fc50370d59197b36e8577067e7297cd27 (image=quay.rdoproject.org/podified-master-centos10/openstack-nova-compute:current, name=nova_compute_init, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-nova-compute:current', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, container_name=nova_compute_init)
Dec 05 06:04:33 compute-0 podman[186508]: 2025-12-05 06:04:33.568130822 +0000 UTC m=+0.101061897 container start a8ef45ef64c498c54881bdb2946b0a1fc50370d59197b36e8577067e7297cd27 (image=quay.rdoproject.org/podified-master-centos10/openstack-nova-compute:current, name=nova_compute_init, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-nova-compute:current', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=nova_compute_init, managed_by=edpm_ansible, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec)
Dec 05 06:04:33 compute-0 python3.9[186492]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Dec 05 06:04:33 compute-0 nova_compute_init[186526]: INFO:nova_statedir:Applying nova statedir ownership
Dec 05 06:04:33 compute-0 nova_compute_init[186526]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Dec 05 06:04:33 compute-0 nova_compute_init[186526]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Dec 05 06:04:33 compute-0 nova_compute_init[186526]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Dec 05 06:04:33 compute-0 nova_compute_init[186526]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Dec 05 06:04:33 compute-0 nova_compute_init[186526]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Dec 05 06:04:33 compute-0 nova_compute_init[186526]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Dec 05 06:04:33 compute-0 nova_compute_init[186526]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Dec 05 06:04:33 compute-0 nova_compute_init[186526]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Dec 05 06:04:33 compute-0 nova_compute_init[186526]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Dec 05 06:04:33 compute-0 nova_compute_init[186526]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Dec 05 06:04:33 compute-0 nova_compute_init[186526]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Dec 05 06:04:33 compute-0 nova_compute_init[186526]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Dec 05 06:04:33 compute-0 nova_compute_init[186526]: INFO:nova_statedir:Nova statedir ownership complete
Dec 05 06:04:33 compute-0 systemd[1]: libpod-a8ef45ef64c498c54881bdb2946b0a1fc50370d59197b36e8577067e7297cd27.scope: Deactivated successfully.
Dec 05 06:04:33 compute-0 podman[186536]: 2025-12-05 06:04:33.653241361 +0000 UTC m=+0.025939309 container died a8ef45ef64c498c54881bdb2946b0a1fc50370d59197b36e8577067e7297cd27 (image=quay.rdoproject.org/podified-master-centos10/openstack-nova-compute:current, name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-nova-compute:current', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 10 Base Image, config_id=edpm, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec)
Dec 05 06:04:33 compute-0 podman[186536]: 2025-12-05 06:04:33.66932675 +0000 UTC m=+0.042024678 container cleanup a8ef45ef64c498c54881bdb2946b0a1fc50370d59197b36e8577067e7297cd27 (image=quay.rdoproject.org/podified-master-centos10/openstack-nova-compute:current, name=nova_compute_init, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-nova-compute:current', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, org.label-schema.schema-version=1.0, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 05 06:04:33 compute-0 systemd[1]: libpod-conmon-a8ef45ef64c498c54881bdb2946b0a1fc50370d59197b36e8577067e7297cd27.scope: Deactivated successfully.
Dec 05 06:04:33 compute-0 sudo[186490]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:33 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a8ef45ef64c498c54881bdb2946b0a1fc50370d59197b36e8577067e7297cd27-userdata-shm.mount: Deactivated successfully.
Dec 05 06:04:34 compute-0 sshd-session[158520]: Connection closed by 192.168.122.30 port 33696
Dec 05 06:04:34 compute-0 sshd-session[158517]: pam_unix(sshd:session): session closed for user zuul
Dec 05 06:04:34 compute-0 systemd-logind[745]: Session 23 logged out. Waiting for processes to exit.
Dec 05 06:04:34 compute-0 systemd[1]: session-23.scope: Deactivated successfully.
Dec 05 06:04:34 compute-0 systemd[1]: session-23.scope: Consumed 1min 16.809s CPU time.
Dec 05 06:04:34 compute-0 systemd-logind[745]: Removed session 23.
Dec 05 06:04:34 compute-0 nova_compute[186329]: 2025-12-05 06:04:34.581 186333 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Dec 05 06:04:34 compute-0 nova_compute[186329]: 2025-12-05 06:04:34.581 186333 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Dec 05 06:04:34 compute-0 nova_compute[186329]: 2025-12-05 06:04:34.582 186333 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Dec 05 06:04:34 compute-0 nova_compute[186329]: 2025-12-05 06:04:34.582 186333 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Dec 05 06:04:34 compute-0 nova_compute[186329]: 2025-12-05 06:04:34.670 186333 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:04:34 compute-0 nova_compute[186329]: 2025-12-05 06:04:34.680 186333 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.010s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:04:34 compute-0 nova_compute[186329]: 2025-12-05 06:04:34.680 186333 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:423
Dec 05 06:04:34 compute-0 nova_compute[186329]: 2025-12-05 06:04:34.708 186333 INFO oslo_service.periodic_task [-] Skipping periodic task _heal_instance_info_cache because its interval is negative
Dec 05 06:04:34 compute-0 nova_compute[186329]: 2025-12-05 06:04:34.710 186333 WARNING oslo_config.cfg [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] Deprecated: Option "heartbeat_in_pthread" from group "oslo_messaging_rabbit" is deprecated for removal (The option is related to Eventlet which will be removed. In addition this has never worked as expected with services using eventlet for core service framework.).  Its value may be silently ignored in the future.
Dec 05 06:04:35 compute-0 nova_compute[186329]: 2025-12-05 06:04:35.597 186333 INFO nova.virt.driver [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Dec 05 06:04:35 compute-0 nova_compute[186329]: 2025-12-05 06:04:35.666 186333 INFO nova.compute.provider_config [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.170 186333 DEBUG oslo_concurrency.lockutils [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.170 186333 DEBUG oslo_concurrency.lockutils [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.171 186333 DEBUG oslo_concurrency.lockutils [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.171 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/service.py:274
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.171 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.171 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.171 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.171 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.172 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.172 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.172 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.172 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.172 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.172 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.172 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.172 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cell_worker_thread_pool_size   = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.173 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.173 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.173 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.173 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.173 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.173 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.173 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.173 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.174 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.174 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.174 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.174 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.174 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.174 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.174 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.174 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] default_green_pool_size        = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.175 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.175 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.175 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] default_thread_pool_size       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.175 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.175 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.175 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] fatal_deprecations             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.175 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.175 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.176 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.176 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.176 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] heal_instance_info_cache_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.176 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.176 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.176 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.176 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.177 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] injected_network_template      = /usr/lib/python3.12/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.177 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.177 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.177 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.177 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.177 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.177 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.178 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.178 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.178 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.178 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] key                            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.178 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.178 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.178 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.178 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.179 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.179 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.179 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.179 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.179 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.179 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.179 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.179 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.179 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.180 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.180 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.180 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.180 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.180 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.180 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.180 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.180 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.181 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.181 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.181 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.181 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.181 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.181 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.181 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] my_shared_fs_storage_ip        = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.181 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.182 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.182 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.182 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.182 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.182 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.182 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.182 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.182 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.182 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] pybasedir                      = /usr/lib/python3.12/site-packages log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.183 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.183 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.183 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.183 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.183 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.183 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.183 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] record                         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.183 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.183 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.184 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.184 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.184 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.184 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.184 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.184 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.184 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.184 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.185 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.185 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.185 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.185 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.185 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.185 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.185 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.185 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.186 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.186 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.186 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.186 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.186 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.186 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.186 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.186 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.186 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.187 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.187 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.187 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.187 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] thread_pool_statistic_period   = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.187 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.187 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.187 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.187 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.187 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.188 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.188 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.188 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.188 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.188 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.188 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.188 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.188 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.188 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.189 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.189 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.189 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.189 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.189 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] os_brick.lock_path             = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.189 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.189 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.189 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.190 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.190 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.190 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.190 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.190 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.190 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.190 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.190 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.191 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.191 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.191 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.191 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.191 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.191 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.191 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.191 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.192 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] api.neutron_default_project_id = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.192 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] api.response_validation        = warn log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.192 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.192 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.192 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.192 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.192 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.192 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.193 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.193 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.193 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.193 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.193 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cache.backend_expiration_time  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.193 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.193 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.193 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.194 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.194 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.194 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.194 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cache.enforce_fips_mode        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.194 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.194 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.194 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.194 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.194 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cache.memcache_password        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.195 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.195 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.195 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.195 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.195 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.195 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.195 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.195 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cache.memcache_username        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.196 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.196 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cache.redis_db                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.196 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cache.redis_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.196 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cache.redis_sentinel_service_name = mymaster log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.196 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cache.redis_sentinels          = ['localhost:26379'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.196 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cache.redis_server             = localhost:6379 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.196 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cache.redis_socket_timeout     = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.196 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cache.redis_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.196 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.197 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.197 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.197 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.197 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.197 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.197 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.197 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.197 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.198 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.198 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.198 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.198 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.198 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.198 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.198 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.198 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.198 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.199 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.199 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.199 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.199 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.199 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.199 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.199 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.199 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.200 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.200 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.200 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.200 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.200 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.200 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.200 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.200 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.201 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.201 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] compute.sharing_providers_max_uuids_per_request = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.201 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.201 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.201 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.201 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.201 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.201 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.202 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] consoleauth.enforce_session_timeout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.202 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.202 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.202 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.202 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.202 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.202 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.202 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.203 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.203 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.203 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.203 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.203 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.203 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cyborg.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.203 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.203 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.203 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.204 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.204 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.204 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.204 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.204 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.204 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] database.asyncio_connection    = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.204 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.204 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.205 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.205 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.205 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.205 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.205 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.205 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.205 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.205 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.205 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.206 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.206 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.206 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.206 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.206 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.206 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.206 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.206 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.207 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.207 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] api_database.asyncio_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.207 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] api_database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.207 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.207 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.207 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.207 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.207 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.208 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.208 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.208 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.208 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.208 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.208 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.208 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.208 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.208 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.209 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.209 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.209 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.209 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.209 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.209 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.209 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.209 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] ephemeral_storage_encryption.default_format = luks log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.210 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.210 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.210 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.210 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.210 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.210 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.210 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.210 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.211 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.211 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.211 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.211 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.211 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.212 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.212 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.212 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.213 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.213 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.213 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.213 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.213 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.213 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.213 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.213 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] glance.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.214 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.214 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.214 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.214 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.214 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.214 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.214 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.214 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.215 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.215 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.215 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] manila.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.215 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] manila.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.215 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] manila.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.215 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] manila.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.215 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] manila.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.215 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] manila.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.216 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] manila.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.216 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] manila.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.216 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] manila.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.216 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] manila.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.216 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] manila.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.216 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] manila.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.216 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] manila.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.216 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] manila.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.216 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] manila.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.217 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] manila.service_type            = shared-file-system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.217 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] manila.share_apply_policy_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.217 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] manila.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.217 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] manila.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.217 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] manila.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.217 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] manila.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.217 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] manila.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.217 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] manila.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.218 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.218 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.218 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.218 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.218 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.218 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.218 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.219 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.219 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.219 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.219 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.219 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.219 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.219 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.219 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.220 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] ironic.conductor_group         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.220 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.220 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.220 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.220 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.220 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.220 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.220 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.220 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.221 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.221 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.221 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.221 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.221 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.221 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] ironic.shard                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.221 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.221 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.222 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.222 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.222 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.222 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.222 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.222 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.222 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.222 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.223 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.223 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.223 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.223 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.223 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.223 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.223 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.223 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.224 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.224 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.224 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.224 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.224 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.224 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.224 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.224 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.224 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.225 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.225 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.225 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.225 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.225 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.225 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.225 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.225 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vault.approle_role_id          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.226 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vault.approle_secret_id        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.226 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.226 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vault.kv_path                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.226 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.226 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.226 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vault.root_token_id            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.226 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.226 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vault.timeout                  = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.227 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.227 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.227 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.227 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.227 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.227 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.227 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.227 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.227 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.228 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.228 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.228 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.228 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.228 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] keystone.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.228 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.228 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.228 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.229 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.229 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.229 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.229 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.229 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.229 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.ceph_mount_options     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.229 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.ceph_mount_point_base  = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.229 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.230 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.230 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.230 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.230 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.230 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.230 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.230 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.230 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.231 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.231 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.231 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.231 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.231 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.231 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.231 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.231 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.232 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.232 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.232 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.232 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.232 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.232 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.232 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.232 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.233 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.233 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.233 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.233 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.233 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.233 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.233 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.233 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.233 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.234 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.234 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.234 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.234 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.234 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.234 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.234 186333 WARNING oslo_config.cfg [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec 05 06:04:36 compute-0 nova_compute[186329]: live_migration_uri is deprecated for removal in favor of two other options that
Dec 05 06:04:36 compute-0 nova_compute[186329]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec 05 06:04:36 compute-0 nova_compute[186329]: and ``live_migration_inbound_addr`` respectively.
Dec 05 06:04:36 compute-0 nova_compute[186329]: ).  Its value may be silently ignored in the future.
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.235 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.235 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.235 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.235 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.235 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.migration_inbound_addr = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.235 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.235 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.236 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.236 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.236 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.236 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.236 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.236 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.236 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.236 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.237 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.237 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.237 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.237 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.237 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.237 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.237 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.237 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.237 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.238 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.238 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.238 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.238 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.238 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.238 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.238 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.238 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.239 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.239 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.239 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.239 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.239 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.239 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.239 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.tb_cache_size          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.239 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.240 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.240 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.240 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.240 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.240 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.240 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.volume_enforce_multipath = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.240 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.240 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.241 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.241 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.241 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.241 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.241 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.241 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.241 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.241 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.242 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.242 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.242 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.242 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.242 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.242 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.242 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.242 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.243 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.243 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.243 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.243 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.243 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.243 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.243 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.243 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.243 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.244 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.244 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] neutron.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.244 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.244 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.244 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.244 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.244 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.244 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.245 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.245 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.245 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.245 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.245 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.245 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] notifications.include_share_mapping = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.245 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.245 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.246 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.246 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.246 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.246 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.246 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.246 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.246 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.246 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.246 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.247 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.247 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.247 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.247 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.247 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.247 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.247 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.247 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.248 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.248 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.248 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.248 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.248 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.248 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.248 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.248 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.248 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.249 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.249 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] placement.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.249 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.249 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.249 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.249 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.249 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.249 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.250 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.250 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.250 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.250 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.250 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.250 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.250 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.250 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.251 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.251 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.251 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.251 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.251 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.251 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.251 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.251 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.251 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.252 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.252 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.252 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.252 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.252 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] quota.unified_limits_resource_list = ['servers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.252 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] quota.unified_limits_resource_strategy = require log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.252 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.252 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.253 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.253 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.253 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.253 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.253 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.253 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.253 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.253 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.254 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.254 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.254 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.254 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.254 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.254 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.254 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.254 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.255 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.255 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.255 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.255 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] filter_scheduler.image_props_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.255 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] filter_scheduler.image_props_weight_setting = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.255 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.255 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.255 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.255 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.256 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.256 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] filter_scheduler.num_instances_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.256 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.256 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.256 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.256 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.256 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.256 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.257 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.257 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.257 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.257 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.257 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.257 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.257 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.257 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.258 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.258 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.258 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.258 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.258 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.258 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.258 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.258 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.259 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.259 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.259 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.259 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.259 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.259 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.259 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.259 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.260 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.260 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.260 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.260 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.260 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.260 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.260 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.260 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] spice.require_secure           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.261 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.261 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.261 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] spice.spice_direct_proxy_base_url = http://127.0.0.1:13002/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.261 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.261 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.261 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.261 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.262 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.262 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.262 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.262 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.262 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.262 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.262 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.262 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.262 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.263 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.263 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.263 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.263 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.263 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.263 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.263 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.263 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.264 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.264 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.264 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.264 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.264 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.264 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.264 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.264 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.264 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.265 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.265 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.265 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.265 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.265 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.265 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.265 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.265 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.266 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.266 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.266 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.266 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.266 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.266 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.266 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.267 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.267 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.267 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.267 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.267 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.267 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.267 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.267 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.268 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.268 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.268 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.268 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.268 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.268 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.268 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.268 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.268 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.269 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.269 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.269 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.269 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.269 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.269 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.269 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.269 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.270 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.270 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.270 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.270 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.270 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.270 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.270 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.270 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.271 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.271 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.271 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.271 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.271 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.271 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.271 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.271 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.272 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.272 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.272 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_rabbit.hostname = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.272 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.272 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.272 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.272 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.272 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.273 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_rabbit.processname = nova-compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.273 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.273 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.273 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.273 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.273 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.273 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.273 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.274 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.274 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.274 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.274 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.274 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.274 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_rabbit.rabbit_transient_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.274 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.274 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.275 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.275 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.275 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.275 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.275 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.275 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.275 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.275 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.276 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.276 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.276 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.276 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.276 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.276 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.276 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.276 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.277 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.277 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.277 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.277 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.277 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.277 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.277 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.277 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_limit.endpoint_interface  = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.277 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.278 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_limit.endpoint_region_name = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.278 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_limit.endpoint_service_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.278 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_limit.endpoint_service_type = compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.278 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.278 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.278 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.278 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.278 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.279 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.279 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.279 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.279 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.279 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.279 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_limit.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.279 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.279 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.279 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.280 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.280 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.280 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.280 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.280 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.280 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.280 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.280 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.281 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.281 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.281 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.281 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.281 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.281 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.281 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.281 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.282 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.282 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vif_plug_linux_bridge_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.282 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.282 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 4 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.282 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.282 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.282 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.282 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.282 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vif_plug_ovs_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.283 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.283 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 4 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.283 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.283 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.283 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.283 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.283 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.283 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.284 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.284 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.284 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.284 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] os_vif_ovs.default_qos_type    = linux-noop log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.284 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.284 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.284 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.284 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.285 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.285 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.285 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] privsep_osbrick.capabilities   = [21, 2] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.285 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.285 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.285 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] privsep_osbrick.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.285 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.285 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] privsep_osbrick.thread_pool_size = 4 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.285 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.286 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.286 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.286 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.286 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] nova_sys_admin.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.286 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.286 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] nova_sys_admin.thread_pool_size = 4 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.286 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.286 186333 DEBUG oslo_service.backend._eventlet.service [None req-2d4515f7-281e-4ccd-ae8d-2261c3883e18 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.287 186333 INFO nova.service [-] Starting compute node (version 32.1.0-0.20251105112212.710ffbb.el10)
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.791 186333 DEBUG nova.virt.libvirt.host [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:498
Dec 05 06:04:36 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Dec 05 06:04:36 compute-0 systemd[1]: Started libvirt QEMU daemon.
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.838 186333 DEBUG nova.virt.libvirt.host [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f7f72f412e0> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:504
Dec 05 06:04:36 compute-0 nova_compute[186329]: libvirt:  error : internal error: could not initialize domain event timer
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.838 186333 WARNING nova.virt.libvirt.host [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] URI qemu:///system does not support events: internal error: could not initialize domain event timer: libvirt.libvirtError: internal error: could not initialize domain event timer
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.839 186333 DEBUG nova.virt.libvirt.host [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f7f72f412e0> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:525
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.840 186333 DEBUG nova.virt.libvirt.host [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] Starting native event thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:484
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.840 186333 DEBUG nova.virt.libvirt.host [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:490
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.840 186333 INFO nova.utils [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] The default thread pool MainProcess.default is initialized
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.841 186333 DEBUG nova.virt.libvirt.host [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] Starting connection event dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:493
Dec 05 06:04:36 compute-0 nova_compute[186329]: 2025-12-05 06:04:36.841 186333 INFO nova.virt.libvirt.driver [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] Connection event '1' reason 'None'
Dec 05 06:04:37 compute-0 nova_compute[186329]: 2025-12-05 06:04:37.345 186333 WARNING nova.virt.libvirt.driver [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Dec 05 06:04:37 compute-0 nova_compute[186329]: 2025-12-05 06:04:37.346 186333 DEBUG nova.virt.libvirt.volume.mount [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:130
Dec 05 06:04:37 compute-0 nova_compute[186329]: 2025-12-05 06:04:37.511 186333 INFO nova.virt.libvirt.host [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] Libvirt host capabilities <capabilities>
Dec 05 06:04:37 compute-0 nova_compute[186329]: 
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <host>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <uuid>5b9d8781-3e9f-4035-9ba6-9ba6577c62f7</uuid>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <cpu>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <arch>x86_64</arch>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model>EPYC-Milan-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <vendor>AMD</vendor>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <microcode version='167776725'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <signature family='25' model='1' stepping='1'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <topology sockets='4' dies='1' clusters='1' cores='1' threads='1'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <maxphysaddr mode='emulate' bits='48'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature name='x2apic'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature name='tsc-deadline'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature name='osxsave'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature name='hypervisor'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature name='tsc_adjust'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature name='ospke'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature name='vaes'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature name='vpclmulqdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature name='spec-ctrl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature name='stibp'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature name='arch-capabilities'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature name='ssbd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature name='cmp_legacy'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature name='virt-ssbd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature name='lbrv'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature name='tsc-scale'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature name='vmcb-clean'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature name='pause-filter'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature name='pfthreshold'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature name='v-vmsave-vmload'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature name='vgif'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature name='rdctl-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature name='skip-l1dfl-vmentry'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature name='mds-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature name='pschange-mc-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <pages unit='KiB' size='4'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <pages unit='KiB' size='2048'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <pages unit='KiB' size='1048576'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </cpu>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <power_management>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <suspend_mem/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <suspend_disk/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <suspend_hybrid/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </power_management>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <iommu support='no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <migration_features>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <live/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <uri_transports>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <uri_transport>tcp</uri_transport>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <uri_transport>rdma</uri_transport>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </uri_transports>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </migration_features>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <topology>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <cells num='1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <cell id='0'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:           <memory unit='KiB'>7865368</memory>
Dec 05 06:04:37 compute-0 nova_compute[186329]:           <pages unit='KiB' size='4'>1966342</pages>
Dec 05 06:04:37 compute-0 nova_compute[186329]:           <pages unit='KiB' size='2048'>0</pages>
Dec 05 06:04:37 compute-0 nova_compute[186329]:           <pages unit='KiB' size='1048576'>0</pages>
Dec 05 06:04:37 compute-0 nova_compute[186329]:           <distances>
Dec 05 06:04:37 compute-0 nova_compute[186329]:             <sibling id='0' value='10'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:           </distances>
Dec 05 06:04:37 compute-0 nova_compute[186329]:           <cpus num='4'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:           </cpus>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         </cell>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </cells>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </topology>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <cache>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </cache>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <secmodel>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model>selinux</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <doi>0</doi>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </secmodel>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <secmodel>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model>dac</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <doi>0</doi>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <baselabel type='kvm'>+107:+107</baselabel>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <baselabel type='qemu'>+107:+107</baselabel>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </secmodel>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   </host>
Dec 05 06:04:37 compute-0 nova_compute[186329]: 
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <guest>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <os_type>hvm</os_type>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <arch name='i686'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <wordsize>32</wordsize>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <domain type='qemu'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <domain type='kvm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </arch>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <features>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <pae/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <nonpae/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <acpi default='on' toggle='yes'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <apic default='on' toggle='no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <cpuselection/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <deviceboot/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <disksnapshot default='on' toggle='no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <externalSnapshot/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </features>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   </guest>
Dec 05 06:04:37 compute-0 nova_compute[186329]: 
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <guest>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <os_type>hvm</os_type>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <arch name='x86_64'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <wordsize>64</wordsize>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <domain type='qemu'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <domain type='kvm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </arch>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <features>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <acpi default='on' toggle='yes'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <apic default='on' toggle='no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <cpuselection/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <deviceboot/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <disksnapshot default='on' toggle='no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <externalSnapshot/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </features>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   </guest>
Dec 05 06:04:37 compute-0 nova_compute[186329]: 
Dec 05 06:04:37 compute-0 nova_compute[186329]: </capabilities>
Dec 05 06:04:37 compute-0 nova_compute[186329]: 
Dec 05 06:04:37 compute-0 nova_compute[186329]: 2025-12-05 06:04:37.515 186333 DEBUG nova.virt.libvirt.host [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:944
Dec 05 06:04:37 compute-0 nova_compute[186329]: 2025-12-05 06:04:37.527 186333 DEBUG nova.virt.libvirt.host [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec 05 06:04:37 compute-0 nova_compute[186329]: <domainCapabilities>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <path>/usr/libexec/qemu-kvm</path>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <domain>kvm</domain>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <arch>i686</arch>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <vcpu max='4096'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <iothreads supported='yes'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <os supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <enum name='firmware'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <loader supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='type'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>rom</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>pflash</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='readonly'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>yes</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>no</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='secure'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>no</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </loader>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   </os>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <cpu>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <mode name='host-passthrough' supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='hostPassthroughMigratable'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>on</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>off</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </mode>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <mode name='maximum' supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='maximumMigratable'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>on</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>off</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </mode>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <mode name='host-model' supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model fallback='forbid'>EPYC-Milan</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <vendor>AMD</vendor>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <maxphysaddr mode='passthrough' limit='48'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='x2apic'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='tsc-deadline'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='hypervisor'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='tsc_adjust'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='vaes'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='vpclmulqdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='spec-ctrl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='stibp'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='ssbd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='cmp_legacy'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='overflow-recov'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='succor'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='virt-ssbd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='lbrv'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='tsc-scale'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='vmcb-clean'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='flushbyasid'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='pause-filter'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='pfthreshold'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='v-vmsave-vmload'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='vgif'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </mode>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <mode name='custom' supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Broadwell'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Broadwell-IBRS'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Broadwell-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Broadwell-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Broadwell-v3'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Broadwell-v4</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Cascadelake-Server'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Cascadelake-Server-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Cascadelake-Server-v2'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Cascadelake-Server-v3'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Cascadelake-Server-v4'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Cascadelake-Server-v5'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Cooperlake'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Cooperlake-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Cooperlake-v2'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Denverton'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='mpx'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Denverton-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='mpx'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Denverton-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Denverton-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Hygon'>Dhyana-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='EPYC-Genoa'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amd-psfd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='auto-ibrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='no-nested-data-bp'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='null-sel-clr-base'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='stibp-always-on'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='EPYC-Genoa-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amd-psfd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='auto-ibrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='no-nested-data-bp'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='null-sel-clr-base'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='stibp-always-on'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD'>EPYC-Milan-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='EPYC-Milan-v2'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amd-psfd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='no-nested-data-bp'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='null-sel-clr-base'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='stibp-always-on'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD'>EPYC-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD'>EPYC-v4</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='GraniteRapids'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-fp16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-int8'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-tile'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-fp16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='bus-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fbsdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrc'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fzrm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='mcdt-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='pbrsb-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='prefetchiti'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='psdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='sbdr-ssdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='serialize'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='tsx-ldtrk'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xfd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='GraniteRapids-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-fp16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-int8'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-tile'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-fp16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='bus-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fbsdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrc'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fzrm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='mcdt-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='pbrsb-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='prefetchiti'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='psdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='sbdr-ssdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='serialize'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='tsx-ldtrk'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xfd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='GraniteRapids-v2'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-fp16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-int8'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-tile'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx10'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx10-128'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx10-256'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx10-512'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-fp16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='bus-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='cldemote'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fbsdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrc'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fzrm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='mcdt-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdir64b'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdiri'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='pbrsb-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='prefetchiti'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='psdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='sbdr-ssdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='serialize'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ss'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='tsx-ldtrk'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xfd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Haswell'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Haswell-IBRS'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Haswell-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Haswell-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Haswell-v3'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Haswell-v4</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Icelake-Server'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Icelake-Server-noTSX'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Icelake-Server-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Icelake-Server-v2'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Icelake-Server-v3'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Icelake-Server-v4'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Icelake-Server-v5'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Icelake-Server-v6'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Icelake-Server-v7'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>IvyBridge-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>IvyBridge-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='KnightsMill'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-4fmaps'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-4vnniw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512er'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512pf'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ss'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='KnightsMill-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-4fmaps'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-4vnniw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512er'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512pf'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ss'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Opteron_G4'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fma4'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xop'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Opteron_G4-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fma4'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xop'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Opteron_G5'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fma4'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='tbm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xop'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Opteron_G5-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fma4'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='tbm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xop'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='SapphireRapids'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-int8'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-tile'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-fp16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='bus-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrc'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fzrm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='serialize'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='tsx-ldtrk'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xfd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='SapphireRapids-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-int8'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-tile'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-fp16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='bus-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrc'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fzrm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='serialize'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='tsx-ldtrk'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xfd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='SapphireRapids-v2'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-int8'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-tile'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-fp16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='bus-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fbsdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrc'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fzrm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='psdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='sbdr-ssdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='serialize'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='tsx-ldtrk'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xfd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='SapphireRapids-v3'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-int8'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-tile'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-fp16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='bus-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='cldemote'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fbsdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrc'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fzrm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdir64b'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdiri'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='psdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='sbdr-ssdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='serialize'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ss'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='tsx-ldtrk'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xfd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='SierraForest'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-ne-convert'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni-int8'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='bus-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='cmpccxadd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fbsdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='mcdt-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='pbrsb-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='psdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='sbdr-ssdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='serialize'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='SierraForest-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-ne-convert'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni-int8'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='bus-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='cmpccxadd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fbsdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='mcdt-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='pbrsb-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='psdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='sbdr-ssdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='serialize'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Client'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Client-IBRS'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Client-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Client-v2'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Skylake-Client-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Skylake-Client-v4</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Server'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Server-IBRS'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Server-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Server-v2'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Server-v3'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Server-v4'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Server-v5'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Snowridge'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='cldemote'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='core-capability'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdir64b'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdiri'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='mpx'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='split-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Snowridge-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='cldemote'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='core-capability'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdir64b'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdiri'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='mpx'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='split-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Snowridge-v2'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='cldemote'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='core-capability'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdir64b'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdiri'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='split-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Snowridge-v3'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='cldemote'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='core-capability'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdir64b'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdiri'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='split-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Snowridge-v4'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='cldemote'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdir64b'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdiri'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='athlon'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='3dnow'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='3dnowext'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='athlon-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='3dnow'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='3dnowext'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='core2duo'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ss'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='core2duo-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ss'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='coreduo'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ss'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='coreduo-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ss'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='n270'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ss'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='n270-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ss'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='phenom'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='3dnow'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='3dnowext'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='phenom-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='3dnow'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='3dnowext'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </mode>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   </cpu>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <memoryBacking supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <enum name='sourceType'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <value>file</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <value>anonymous</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <value>memfd</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   </memoryBacking>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <devices>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <disk supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='diskDevice'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>disk</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>cdrom</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>floppy</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>lun</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='bus'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>fdc</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>scsi</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>virtio</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>usb</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>sata</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='model'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>virtio</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>virtio-transitional</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>virtio-non-transitional</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </disk>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <graphics supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='type'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>vnc</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>egl-headless</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>dbus</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </graphics>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <video supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='modelType'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>vga</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>cirrus</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>virtio</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>none</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>bochs</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>ramfb</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </video>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <hostdev supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='mode'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>subsystem</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='startupPolicy'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>default</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>mandatory</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>requisite</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>optional</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='subsysType'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>usb</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>pci</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>scsi</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='capsType'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='pciBackend'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </hostdev>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <rng supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='model'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>virtio</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>virtio-transitional</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>virtio-non-transitional</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='backendModel'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>random</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>egd</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>builtin</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </rng>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <filesystem supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='driverType'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>path</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>handle</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>virtiofs</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </filesystem>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <tpm supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='model'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>tpm-tis</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>tpm-crb</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='backendModel'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>emulator</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>external</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='backendVersion'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>2.0</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </tpm>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <redirdev supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='bus'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>usb</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </redirdev>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <channel supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='type'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>pty</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>unix</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </channel>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <crypto supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='model'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='type'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>qemu</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='backendModel'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>builtin</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </crypto>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <interface supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='backendType'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>default</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>passt</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </interface>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <panic supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='model'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>isa</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>hyperv</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </panic>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <console supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='type'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>null</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>vc</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>pty</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>dev</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>file</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>pipe</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>stdio</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>udp</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>tcp</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>unix</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>qemu-vdagent</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>dbus</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </console>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   </devices>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <features>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <gic supported='no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <vmcoreinfo supported='yes'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <genid supported='yes'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <backingStoreInput supported='yes'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <backup supported='yes'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <async-teardown supported='yes'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <ps2 supported='yes'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <sev supported='no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <sgx supported='no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <hyperv supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='features'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>relaxed</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>vapic</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>spinlocks</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>vpindex</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>runtime</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>synic</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>stimer</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>reset</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>vendor_id</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>frequencies</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>reenlightenment</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>tlbflush</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>ipi</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>avic</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>emsr_bitmap</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>xmm_input</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <defaults>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <spinlocks>4095</spinlocks>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <stimer_direct>on</stimer_direct>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <tlbflush_direct>on</tlbflush_direct>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <tlbflush_extended>on</tlbflush_extended>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </defaults>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </hyperv>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <launchSecurity supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='sectype'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>tdx</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </launchSecurity>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   </features>
Dec 05 06:04:37 compute-0 nova_compute[186329]: </domainCapabilities>
Dec 05 06:04:37 compute-0 nova_compute[186329]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Dec 05 06:04:37 compute-0 nova_compute[186329]: 2025-12-05 06:04:37.530 186333 DEBUG nova.virt.libvirt.host [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec 05 06:04:37 compute-0 nova_compute[186329]: <domainCapabilities>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <path>/usr/libexec/qemu-kvm</path>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <domain>kvm</domain>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <arch>i686</arch>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <vcpu max='240'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <iothreads supported='yes'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <os supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <enum name='firmware'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <loader supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='type'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>rom</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>pflash</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='readonly'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>yes</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>no</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='secure'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>no</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </loader>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   </os>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <cpu>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <mode name='host-passthrough' supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='hostPassthroughMigratable'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>on</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>off</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </mode>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <mode name='maximum' supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='maximumMigratable'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>on</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>off</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </mode>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <mode name='host-model' supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model fallback='forbid'>EPYC-Milan</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <vendor>AMD</vendor>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <maxphysaddr mode='passthrough' limit='48'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='x2apic'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='tsc-deadline'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='hypervisor'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='tsc_adjust'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='vaes'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='vpclmulqdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='spec-ctrl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='stibp'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='ssbd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='cmp_legacy'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='overflow-recov'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='succor'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='virt-ssbd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='lbrv'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='tsc-scale'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='vmcb-clean'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='flushbyasid'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='pause-filter'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='pfthreshold'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='v-vmsave-vmload'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='vgif'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </mode>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <mode name='custom' supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Broadwell'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Broadwell-IBRS'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Broadwell-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Broadwell-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Broadwell-v3'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Broadwell-v4</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Cascadelake-Server'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Cascadelake-Server-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Cascadelake-Server-v2'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Cascadelake-Server-v3'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Cascadelake-Server-v4'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Cascadelake-Server-v5'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Cooperlake'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Cooperlake-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Cooperlake-v2'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Denverton'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='mpx'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Denverton-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='mpx'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Denverton-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Denverton-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Hygon'>Dhyana-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='EPYC-Genoa'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amd-psfd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='auto-ibrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='no-nested-data-bp'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='null-sel-clr-base'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='stibp-always-on'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='EPYC-Genoa-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amd-psfd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='auto-ibrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='no-nested-data-bp'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='null-sel-clr-base'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='stibp-always-on'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD'>EPYC-Milan-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='EPYC-Milan-v2'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amd-psfd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='no-nested-data-bp'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='null-sel-clr-base'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='stibp-always-on'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD'>EPYC-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD'>EPYC-v4</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='GraniteRapids'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-fp16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-int8'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-tile'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-fp16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='bus-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fbsdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrc'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fzrm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='mcdt-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='pbrsb-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='prefetchiti'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='psdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='sbdr-ssdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='serialize'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='tsx-ldtrk'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xfd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='GraniteRapids-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-fp16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-int8'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-tile'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-fp16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='bus-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fbsdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrc'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fzrm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='mcdt-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='pbrsb-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='prefetchiti'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='psdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='sbdr-ssdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='serialize'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='tsx-ldtrk'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xfd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='GraniteRapids-v2'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-fp16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-int8'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-tile'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx10'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx10-128'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx10-256'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx10-512'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-fp16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='bus-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='cldemote'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fbsdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrc'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fzrm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='mcdt-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdir64b'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdiri'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='pbrsb-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='prefetchiti'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='psdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='sbdr-ssdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='serialize'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ss'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='tsx-ldtrk'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xfd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Haswell'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Haswell-IBRS'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Haswell-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Haswell-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Haswell-v3'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Haswell-v4</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Icelake-Server'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Icelake-Server-noTSX'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Icelake-Server-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Icelake-Server-v2'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Icelake-Server-v3'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Icelake-Server-v4'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Icelake-Server-v5'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Icelake-Server-v6'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Icelake-Server-v7'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>IvyBridge-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>IvyBridge-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='KnightsMill'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-4fmaps'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-4vnniw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512er'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512pf'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ss'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='KnightsMill-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-4fmaps'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-4vnniw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512er'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512pf'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ss'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Opteron_G4'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fma4'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xop'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Opteron_G4-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fma4'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xop'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Opteron_G5'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fma4'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='tbm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xop'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Opteron_G5-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fma4'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='tbm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xop'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='SapphireRapids'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-int8'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-tile'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-fp16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='bus-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrc'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fzrm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='serialize'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='tsx-ldtrk'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xfd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='SapphireRapids-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-int8'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-tile'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-fp16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='bus-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrc'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fzrm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='serialize'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='tsx-ldtrk'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xfd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='SapphireRapids-v2'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-int8'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-tile'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-fp16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='bus-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fbsdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrc'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fzrm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='psdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='sbdr-ssdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='serialize'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='tsx-ldtrk'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xfd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='SapphireRapids-v3'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-int8'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-tile'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-fp16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='bus-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='cldemote'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fbsdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrc'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fzrm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdir64b'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdiri'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='psdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='sbdr-ssdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='serialize'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ss'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='tsx-ldtrk'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xfd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='SierraForest'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-ne-convert'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni-int8'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='bus-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='cmpccxadd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fbsdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='mcdt-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='pbrsb-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='psdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='sbdr-ssdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='serialize'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='SierraForest-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-ne-convert'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni-int8'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='bus-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='cmpccxadd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fbsdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='mcdt-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='pbrsb-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='psdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='sbdr-ssdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='serialize'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Client'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Client-IBRS'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Client-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Client-v2'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Skylake-Client-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Skylake-Client-v4</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Server'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Server-IBRS'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Server-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Server-v2'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Server-v3'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Server-v4'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Server-v5'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Snowridge'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='cldemote'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='core-capability'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdir64b'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdiri'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='mpx'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='split-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Snowridge-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='cldemote'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='core-capability'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdir64b'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdiri'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='mpx'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='split-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Snowridge-v2'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='cldemote'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='core-capability'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdir64b'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdiri'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='split-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Snowridge-v3'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='cldemote'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='core-capability'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdir64b'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdiri'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='split-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Snowridge-v4'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='cldemote'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdir64b'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdiri'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='athlon'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='3dnow'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='3dnowext'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='athlon-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='3dnow'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='3dnowext'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='core2duo'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ss'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='core2duo-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ss'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='coreduo'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ss'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='coreduo-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ss'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='n270'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ss'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='n270-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ss'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='phenom'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='3dnow'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='3dnowext'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='phenom-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='3dnow'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='3dnowext'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </mode>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   </cpu>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <memoryBacking supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <enum name='sourceType'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <value>file</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <value>anonymous</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <value>memfd</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   </memoryBacking>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <devices>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <disk supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='diskDevice'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>disk</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>cdrom</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>floppy</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>lun</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='bus'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>ide</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>fdc</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>scsi</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>virtio</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>usb</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>sata</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='model'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>virtio</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>virtio-transitional</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>virtio-non-transitional</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </disk>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <graphics supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='type'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>vnc</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>egl-headless</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>dbus</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </graphics>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <video supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='modelType'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>vga</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>cirrus</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>virtio</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>none</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>bochs</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>ramfb</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </video>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <hostdev supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='mode'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>subsystem</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='startupPolicy'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>default</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>mandatory</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>requisite</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>optional</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='subsysType'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>usb</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>pci</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>scsi</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='capsType'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='pciBackend'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </hostdev>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <rng supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='model'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>virtio</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>virtio-transitional</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>virtio-non-transitional</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='backendModel'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>random</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>egd</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>builtin</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </rng>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <filesystem supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='driverType'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>path</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>handle</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>virtiofs</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </filesystem>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <tpm supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='model'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>tpm-tis</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>tpm-crb</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='backendModel'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>emulator</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>external</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='backendVersion'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>2.0</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </tpm>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <redirdev supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='bus'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>usb</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </redirdev>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <channel supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='type'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>pty</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>unix</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </channel>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <crypto supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='model'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='type'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>qemu</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='backendModel'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>builtin</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </crypto>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <interface supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='backendType'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>default</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>passt</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </interface>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <panic supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='model'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>isa</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>hyperv</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </panic>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <console supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='type'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>null</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>vc</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>pty</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>dev</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>file</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>pipe</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>stdio</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>udp</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>tcp</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>unix</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>qemu-vdagent</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>dbus</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </console>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   </devices>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <features>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <gic supported='no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <vmcoreinfo supported='yes'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <genid supported='yes'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <backingStoreInput supported='yes'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <backup supported='yes'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <async-teardown supported='yes'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <ps2 supported='yes'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <sev supported='no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <sgx supported='no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <hyperv supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='features'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>relaxed</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>vapic</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>spinlocks</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>vpindex</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>runtime</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>synic</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>stimer</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>reset</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>vendor_id</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>frequencies</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>reenlightenment</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>tlbflush</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>ipi</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>avic</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>emsr_bitmap</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>xmm_input</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <defaults>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <spinlocks>4095</spinlocks>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <stimer_direct>on</stimer_direct>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <tlbflush_direct>on</tlbflush_direct>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <tlbflush_extended>on</tlbflush_extended>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </defaults>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </hyperv>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <launchSecurity supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='sectype'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>tdx</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </launchSecurity>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   </features>
Dec 05 06:04:37 compute-0 nova_compute[186329]: </domainCapabilities>
Dec 05 06:04:37 compute-0 nova_compute[186329]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Dec 05 06:04:37 compute-0 nova_compute[186329]: 2025-12-05 06:04:37.532 186333 DEBUG nova.virt.libvirt.host [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:944
Dec 05 06:04:37 compute-0 nova_compute[186329]: 2025-12-05 06:04:37.533 186333 DEBUG nova.virt.libvirt.host [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec 05 06:04:37 compute-0 nova_compute[186329]: <domainCapabilities>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <path>/usr/libexec/qemu-kvm</path>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <domain>kvm</domain>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <arch>x86_64</arch>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <vcpu max='4096'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <iothreads supported='yes'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <os supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <enum name='firmware'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <value>efi</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <loader supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='type'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>rom</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>pflash</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='readonly'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>yes</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>no</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='secure'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>yes</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>no</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </loader>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   </os>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <cpu>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <mode name='host-passthrough' supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='hostPassthroughMigratable'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>on</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>off</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </mode>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <mode name='maximum' supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='maximumMigratable'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>on</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>off</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </mode>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <mode name='host-model' supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model fallback='forbid'>EPYC-Milan</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <vendor>AMD</vendor>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <maxphysaddr mode='passthrough' limit='48'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='x2apic'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='tsc-deadline'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='hypervisor'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='tsc_adjust'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='vaes'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='vpclmulqdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='spec-ctrl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='stibp'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='ssbd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='cmp_legacy'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='overflow-recov'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='succor'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='virt-ssbd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='lbrv'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='tsc-scale'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='vmcb-clean'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='flushbyasid'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='pause-filter'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='pfthreshold'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='v-vmsave-vmload'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='vgif'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </mode>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <mode name='custom' supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Broadwell'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Broadwell-IBRS'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Broadwell-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Broadwell-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Broadwell-v3'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Broadwell-v4</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Cascadelake-Server'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Cascadelake-Server-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Cascadelake-Server-v2'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Cascadelake-Server-v3'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Cascadelake-Server-v4'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Cascadelake-Server-v5'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Cooperlake'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Cooperlake-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Cooperlake-v2'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Denverton'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='mpx'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Denverton-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='mpx'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Denverton-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Denverton-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Hygon'>Dhyana-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='EPYC-Genoa'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amd-psfd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='auto-ibrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='no-nested-data-bp'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='null-sel-clr-base'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='stibp-always-on'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='EPYC-Genoa-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amd-psfd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='auto-ibrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='no-nested-data-bp'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='null-sel-clr-base'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='stibp-always-on'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD'>EPYC-Milan-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='EPYC-Milan-v2'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amd-psfd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='no-nested-data-bp'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='null-sel-clr-base'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='stibp-always-on'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD'>EPYC-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD'>EPYC-v4</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='GraniteRapids'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-fp16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-int8'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-tile'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-fp16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='bus-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fbsdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrc'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fzrm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='mcdt-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='pbrsb-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='prefetchiti'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='psdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='sbdr-ssdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='serialize'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='tsx-ldtrk'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xfd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='GraniteRapids-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-fp16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-int8'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-tile'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-fp16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='bus-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fbsdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrc'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fzrm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='mcdt-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='pbrsb-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='prefetchiti'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='psdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='sbdr-ssdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='serialize'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='tsx-ldtrk'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xfd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='GraniteRapids-v2'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-fp16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-int8'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-tile'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx10'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx10-128'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx10-256'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx10-512'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-fp16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='bus-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='cldemote'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fbsdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrc'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fzrm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='mcdt-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdir64b'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdiri'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='pbrsb-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='prefetchiti'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='psdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='sbdr-ssdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='serialize'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ss'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='tsx-ldtrk'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xfd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Haswell'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Haswell-IBRS'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Haswell-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Haswell-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Haswell-v3'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Haswell-v4</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Icelake-Server'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Icelake-Server-noTSX'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Icelake-Server-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Icelake-Server-v2'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Icelake-Server-v3'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Icelake-Server-v4'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Icelake-Server-v5'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Icelake-Server-v6'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Icelake-Server-v7'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>IvyBridge-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>IvyBridge-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='KnightsMill'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-4fmaps'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-4vnniw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512er'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512pf'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ss'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='KnightsMill-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-4fmaps'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-4vnniw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512er'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512pf'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ss'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Opteron_G4'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fma4'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xop'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Opteron_G4-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fma4'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xop'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Opteron_G5'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fma4'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='tbm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xop'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Opteron_G5-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fma4'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='tbm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xop'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='SapphireRapids'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-int8'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-tile'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-fp16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='bus-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrc'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fzrm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='serialize'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='tsx-ldtrk'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xfd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='SapphireRapids-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-int8'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-tile'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-fp16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='bus-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrc'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fzrm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='serialize'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='tsx-ldtrk'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xfd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='SapphireRapids-v2'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-int8'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-tile'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-fp16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='bus-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fbsdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrc'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fzrm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='psdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='sbdr-ssdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='serialize'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='tsx-ldtrk'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xfd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='SapphireRapids-v3'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-int8'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-tile'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-fp16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='bus-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='cldemote'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fbsdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrc'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fzrm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdir64b'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdiri'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='psdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='sbdr-ssdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='serialize'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ss'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='tsx-ldtrk'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xfd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='SierraForest'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-ne-convert'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni-int8'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='bus-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='cmpccxadd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fbsdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='mcdt-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='pbrsb-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='psdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='sbdr-ssdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='serialize'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='SierraForest-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-ne-convert'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni-int8'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='bus-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='cmpccxadd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fbsdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='mcdt-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='pbrsb-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='psdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='sbdr-ssdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='serialize'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Client'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Client-IBRS'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Client-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Client-v2'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Skylake-Client-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Skylake-Client-v4</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Server'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Server-IBRS'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Server-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Server-v2'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Server-v3'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Server-v4'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Server-v5'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Snowridge'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='cldemote'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='core-capability'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdir64b'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdiri'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='mpx'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='split-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Snowridge-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='cldemote'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='core-capability'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdir64b'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdiri'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='mpx'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='split-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Snowridge-v2'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='cldemote'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='core-capability'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdir64b'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdiri'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='split-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Snowridge-v3'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='cldemote'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='core-capability'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdir64b'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdiri'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='split-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Snowridge-v4'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='cldemote'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdir64b'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdiri'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='athlon'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='3dnow'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='3dnowext'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='athlon-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='3dnow'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='3dnowext'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='core2duo'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ss'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='core2duo-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ss'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='coreduo'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ss'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='coreduo-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ss'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='n270'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ss'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='n270-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ss'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='phenom'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='3dnow'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='3dnowext'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='phenom-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='3dnow'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='3dnowext'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </mode>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   </cpu>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <memoryBacking supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <enum name='sourceType'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <value>file</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <value>anonymous</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <value>memfd</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   </memoryBacking>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <devices>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <disk supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='diskDevice'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>disk</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>cdrom</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>floppy</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>lun</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='bus'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>fdc</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>scsi</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>virtio</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>usb</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>sata</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='model'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>virtio</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>virtio-transitional</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>virtio-non-transitional</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </disk>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <graphics supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='type'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>vnc</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>egl-headless</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>dbus</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </graphics>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <video supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='modelType'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>vga</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>cirrus</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>virtio</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>none</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>bochs</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>ramfb</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </video>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <hostdev supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='mode'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>subsystem</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='startupPolicy'>
Dec 05 06:04:37 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>default</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>mandatory</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>requisite</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>optional</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='subsysType'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>usb</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>pci</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>scsi</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='capsType'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='pciBackend'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </hostdev>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <rng supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='model'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>virtio</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>virtio-transitional</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>virtio-non-transitional</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='backendModel'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>random</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>egd</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>builtin</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </rng>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <filesystem supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='driverType'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>path</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>handle</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>virtiofs</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </filesystem>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <tpm supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='model'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>tpm-tis</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>tpm-crb</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='backendModel'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>emulator</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>external</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='backendVersion'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>2.0</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </tpm>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <redirdev supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='bus'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>usb</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </redirdev>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <channel supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='type'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>pty</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>unix</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </channel>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <crypto supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='model'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='type'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>qemu</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='backendModel'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>builtin</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </crypto>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <interface supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='backendType'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>default</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>passt</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </interface>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <panic supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='model'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>isa</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>hyperv</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </panic>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <console supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='type'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>null</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>vc</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>pty</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>dev</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>file</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>pipe</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>stdio</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>udp</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>tcp</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>unix</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>qemu-vdagent</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>dbus</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </console>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   </devices>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <features>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <gic supported='no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <vmcoreinfo supported='yes'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <genid supported='yes'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <backingStoreInput supported='yes'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <backup supported='yes'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <async-teardown supported='yes'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <ps2 supported='yes'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <sev supported='no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <sgx supported='no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <hyperv supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='features'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>relaxed</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>vapic</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>spinlocks</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>vpindex</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>runtime</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>synic</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>stimer</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>reset</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>vendor_id</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>frequencies</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>reenlightenment</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>tlbflush</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>ipi</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>avic</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>emsr_bitmap</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>xmm_input</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <defaults>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <spinlocks>4095</spinlocks>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <stimer_direct>on</stimer_direct>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <tlbflush_direct>on</tlbflush_direct>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <tlbflush_extended>on</tlbflush_extended>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </defaults>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </hyperv>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <launchSecurity supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='sectype'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>tdx</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </launchSecurity>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   </features>
Dec 05 06:04:37 compute-0 nova_compute[186329]: </domainCapabilities>
Dec 05 06:04:37 compute-0 nova_compute[186329]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Dec 05 06:04:37 compute-0 nova_compute[186329]: 2025-12-05 06:04:37.577 186333 DEBUG nova.virt.libvirt.host [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec 05 06:04:37 compute-0 nova_compute[186329]: <domainCapabilities>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <path>/usr/libexec/qemu-kvm</path>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <domain>kvm</domain>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <arch>x86_64</arch>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <vcpu max='240'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <iothreads supported='yes'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <os supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <enum name='firmware'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <loader supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='type'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>rom</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>pflash</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='readonly'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>yes</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>no</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='secure'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>no</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </loader>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   </os>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <cpu>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <mode name='host-passthrough' supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='hostPassthroughMigratable'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>on</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>off</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </mode>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <mode name='maximum' supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='maximumMigratable'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>on</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>off</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </mode>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <mode name='host-model' supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model fallback='forbid'>EPYC-Milan</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <vendor>AMD</vendor>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <maxphysaddr mode='passthrough' limit='48'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='x2apic'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='tsc-deadline'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='hypervisor'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='tsc_adjust'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='vaes'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='vpclmulqdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='spec-ctrl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='stibp'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='ssbd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='cmp_legacy'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='overflow-recov'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='succor'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='virt-ssbd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='lbrv'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='tsc-scale'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='vmcb-clean'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='flushbyasid'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='pause-filter'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='pfthreshold'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='v-vmsave-vmload'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='vgif'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </mode>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <mode name='custom' supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Broadwell'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Broadwell-IBRS'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Broadwell-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Broadwell-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Broadwell-v3'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Broadwell-v4</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Cascadelake-Server'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Cascadelake-Server-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Cascadelake-Server-v2'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Cascadelake-Server-v3'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Cascadelake-Server-v4'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Cascadelake-Server-v5'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Cooperlake'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Cooperlake-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Cooperlake-v2'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Denverton'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='mpx'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Denverton-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='mpx'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Denverton-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Denverton-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Hygon'>Dhyana-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='EPYC-Genoa'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amd-psfd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='auto-ibrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 systemd[1]: Started libvirt nodedev daemon.
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='no-nested-data-bp'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='null-sel-clr-base'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='stibp-always-on'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='EPYC-Genoa-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amd-psfd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='auto-ibrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='no-nested-data-bp'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='null-sel-clr-base'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='stibp-always-on'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD'>EPYC-Milan-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='EPYC-Milan-v2'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amd-psfd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='no-nested-data-bp'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='null-sel-clr-base'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='stibp-always-on'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD'>EPYC-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='AMD'>EPYC-v4</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='GraniteRapids'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-fp16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-int8'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-tile'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-fp16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='bus-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fbsdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrc'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fzrm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='mcdt-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='pbrsb-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='prefetchiti'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='psdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='sbdr-ssdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='serialize'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='tsx-ldtrk'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xfd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='GraniteRapids-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-fp16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-int8'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-tile'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-fp16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='bus-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fbsdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrc'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fzrm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='mcdt-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='pbrsb-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='prefetchiti'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='psdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='sbdr-ssdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='serialize'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='tsx-ldtrk'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xfd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='GraniteRapids-v2'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-fp16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-int8'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-tile'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx10'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx10-128'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx10-256'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx10-512'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-fp16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='bus-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='cldemote'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fbsdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrc'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fzrm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='mcdt-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdir64b'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdiri'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='pbrsb-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='prefetchiti'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='psdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='sbdr-ssdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='serialize'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ss'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='tsx-ldtrk'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xfd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Haswell'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Haswell-IBRS'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Haswell-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Haswell-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Haswell-v3'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Haswell-v4</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Icelake-Server'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Icelake-Server-noTSX'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Icelake-Server-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Icelake-Server-v2'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Icelake-Server-v3'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Icelake-Server-v4'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Icelake-Server-v5'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Icelake-Server-v6'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Icelake-Server-v7'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>IvyBridge-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>IvyBridge-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='KnightsMill'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-4fmaps'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-4vnniw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512er'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512pf'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ss'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='KnightsMill-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-4fmaps'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-4vnniw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512er'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512pf'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ss'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Opteron_G4'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fma4'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xop'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Opteron_G4-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fma4'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xop'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Opteron_G5'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fma4'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='tbm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xop'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Opteron_G5-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fma4'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='tbm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xop'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='SapphireRapids'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-int8'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-tile'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-fp16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='bus-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrc'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fzrm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='serialize'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='tsx-ldtrk'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xfd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='SapphireRapids-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-int8'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-tile'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-fp16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='bus-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrc'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fzrm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='serialize'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='tsx-ldtrk'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xfd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='SapphireRapids-v2'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-int8'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-tile'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-fp16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='bus-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fbsdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrc'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fzrm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='psdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='sbdr-ssdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='serialize'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='tsx-ldtrk'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xfd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='SapphireRapids-v3'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-int8'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='amx-tile'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-bf16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-fp16'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512-vpopcntdq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bitalg'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vbmi2'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='bus-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='cldemote'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fbsdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrc'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fzrm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='la57'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdir64b'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdiri'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='psdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='sbdr-ssdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='serialize'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ss'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='taa-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='tsx-ldtrk'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='xfd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='SierraForest'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-ne-convert'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni-int8'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='bus-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='cmpccxadd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fbsdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='mcdt-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='pbrsb-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='psdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='sbdr-ssdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='serialize'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='SierraForest-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-ifma'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-ne-convert'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx-vnni-int8'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='bus-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='cmpccxadd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fbsdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='fsrs'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ibrs-all'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='mcdt-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='pbrsb-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='psdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='sbdr-ssdp-no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='serialize'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Client'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Client-IBRS'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Client-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Client-v2'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Skylake-Client-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Skylake-Client-v4</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Server'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Server-IBRS'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Server-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Server-v2'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='hle'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='rtm'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Server-v3'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Server-v4'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Skylake-Server-v5'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512bw'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512cd'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512dq'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512f'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='avx512vl'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Snowridge'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='cldemote'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='core-capability'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdir64b'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdiri'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='mpx'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='split-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Snowridge-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='cldemote'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='core-capability'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdir64b'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdiri'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='mpx'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='split-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Snowridge-v2'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='cldemote'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='core-capability'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdir64b'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdiri'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='split-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Snowridge-v3'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='cldemote'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='core-capability'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdir64b'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdiri'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='split-lock-detect'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='Snowridge-v4'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='cldemote'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='gfni'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdir64b'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='movdiri'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='athlon'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='3dnow'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='3dnowext'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='athlon-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='3dnow'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='3dnowext'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='core2duo'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ss'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='core2duo-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ss'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='coreduo'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ss'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='coreduo-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ss'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='n270'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ss'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='n270-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='ss'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='phenom'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='3dnow'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='3dnowext'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <blockers model='phenom-v1'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='3dnow'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <feature name='3dnowext'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </blockers>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </mode>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   </cpu>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <memoryBacking supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <enum name='sourceType'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <value>file</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <value>anonymous</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <value>memfd</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   </memoryBacking>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <devices>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <disk supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='diskDevice'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>disk</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>cdrom</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>floppy</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>lun</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='bus'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>ide</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>fdc</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>scsi</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>virtio</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>usb</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>sata</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='model'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>virtio</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>virtio-transitional</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>virtio-non-transitional</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </disk>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <graphics supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='type'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>vnc</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>egl-headless</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>dbus</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </graphics>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <video supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='modelType'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>vga</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>cirrus</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>virtio</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>none</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>bochs</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>ramfb</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </video>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <hostdev supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='mode'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>subsystem</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='startupPolicy'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>default</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>mandatory</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>requisite</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>optional</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='subsysType'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>usb</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>pci</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>scsi</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='capsType'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='pciBackend'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </hostdev>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <rng supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='model'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>virtio</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>virtio-transitional</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>virtio-non-transitional</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='backendModel'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>random</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>egd</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>builtin</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </rng>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <filesystem supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='driverType'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>path</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>handle</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>virtiofs</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </filesystem>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <tpm supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='model'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>tpm-tis</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>tpm-crb</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='backendModel'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>emulator</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>external</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='backendVersion'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>2.0</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </tpm>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <redirdev supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='bus'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>usb</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </redirdev>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <channel supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='type'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>pty</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>unix</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </channel>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <crypto supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='model'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='type'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>qemu</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='backendModel'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>builtin</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </crypto>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <interface supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='backendType'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>default</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>passt</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </interface>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <panic supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='model'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>isa</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>hyperv</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </panic>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <console supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='type'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>null</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>vc</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>pty</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>dev</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>file</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>pipe</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>stdio</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>udp</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>tcp</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>unix</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>qemu-vdagent</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>dbus</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </console>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   </devices>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <features>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <gic supported='no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <vmcoreinfo supported='yes'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <genid supported='yes'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <backingStoreInput supported='yes'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <backup supported='yes'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <async-teardown supported='yes'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <ps2 supported='yes'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <sev supported='no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <sgx supported='no'/>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <hyperv supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='features'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>relaxed</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>vapic</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>spinlocks</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>vpindex</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>runtime</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>synic</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>stimer</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>reset</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>vendor_id</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>frequencies</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>reenlightenment</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>tlbflush</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>ipi</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>avic</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>emsr_bitmap</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>xmm_input</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <defaults>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <spinlocks>4095</spinlocks>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <stimer_direct>on</stimer_direct>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <tlbflush_direct>on</tlbflush_direct>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <tlbflush_extended>on</tlbflush_extended>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </defaults>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </hyperv>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     <launchSecurity supported='yes'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       <enum name='sectype'>
Dec 05 06:04:37 compute-0 nova_compute[186329]:         <value>tdx</value>
Dec 05 06:04:37 compute-0 nova_compute[186329]:       </enum>
Dec 05 06:04:37 compute-0 nova_compute[186329]:     </launchSecurity>
Dec 05 06:04:37 compute-0 nova_compute[186329]:   </features>
Dec 05 06:04:37 compute-0 nova_compute[186329]: </domainCapabilities>
Dec 05 06:04:37 compute-0 nova_compute[186329]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Dec 05 06:04:37 compute-0 nova_compute[186329]: 2025-12-05 06:04:37.619 186333 DEBUG nova.virt.libvirt.host [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1877
Dec 05 06:04:37 compute-0 nova_compute[186329]: 2025-12-05 06:04:37.620 186333 INFO nova.virt.libvirt.host [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] Secure Boot support detected
Dec 05 06:04:37 compute-0 nova_compute[186329]: 2025-12-05 06:04:37.623 186333 INFO nova.virt.libvirt.driver [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 05 06:04:37 compute-0 nova_compute[186329]: 2025-12-05 06:04:37.777 186333 DEBUG nova.virt.libvirt.driver [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] cpu compare xml: <cpu match="exact">
Dec 05 06:04:37 compute-0 nova_compute[186329]:   <model>Nehalem</model>
Dec 05 06:04:37 compute-0 nova_compute[186329]: </cpu>
Dec 05 06:04:37 compute-0 nova_compute[186329]:  _compare_cpu /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10922
Dec 05 06:04:37 compute-0 nova_compute[186329]: 2025-12-05 06:04:37.778 186333 DEBUG nova.virt.libvirt.driver [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1177
Dec 05 06:04:38 compute-0 nova_compute[186329]: 2025-12-05 06:04:38.285 186333 INFO nova.virt.node [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] Determined node identity f2df025e-56e9-4920-9fad-1a12202c4aeb from /var/lib/nova/compute_id
Dec 05 06:04:38 compute-0 nova_compute[186329]: 2025-12-05 06:04:38.790 186333 WARNING nova.compute.manager [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] Compute nodes ['f2df025e-56e9-4920-9fad-1a12202c4aeb'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Dec 05 06:04:39 compute-0 sshd-session[186672]: Accepted publickey for zuul from 192.168.122.30 port 55452 ssh2: ECDSA SHA256:qs1gnk6DlsWBMwOXH28ouwL9ltr8rmS6SqK96Fn6Lw8
Dec 05 06:04:39 compute-0 systemd-logind[745]: New session 25 of user zuul.
Dec 05 06:04:39 compute-0 systemd[1]: Started Session 25 of User zuul.
Dec 05 06:04:39 compute-0 sshd-session[186672]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 06:04:39 compute-0 nova_compute[186329]: 2025-12-05 06:04:39.797 186333 INFO nova.compute.manager [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Dec 05 06:04:39 compute-0 python3.9[186825]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 05 06:04:40 compute-0 sudo[186979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akdoctwmaaxlkrobfbeofjviddvjrhee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914680.3653998-52-108248342033954/AnsiballZ_systemd_service.py'
Dec 05 06:04:40 compute-0 sudo[186979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:40 compute-0 nova_compute[186329]: 2025-12-05 06:04:40.808 186333 WARNING nova.compute.manager [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Dec 05 06:04:40 compute-0 nova_compute[186329]: 2025-12-05 06:04:40.808 186333 DEBUG oslo_concurrency.lockutils [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:04:40 compute-0 nova_compute[186329]: 2025-12-05 06:04:40.809 186333 DEBUG oslo_concurrency.lockutils [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:04:40 compute-0 nova_compute[186329]: 2025-12-05 06:04:40.809 186333 DEBUG oslo_concurrency.lockutils [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:04:40 compute-0 nova_compute[186329]: 2025-12-05 06:04:40.809 186333 DEBUG nova.compute.resource_tracker [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 05 06:04:40 compute-0 nova_compute[186329]: 2025-12-05 06:04:40.971 186333 WARNING nova.virt.libvirt.driver [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:04:40 compute-0 nova_compute[186329]: 2025-12-05 06:04:40.972 186333 DEBUG oslo_concurrency.processutils [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:04:40 compute-0 nova_compute[186329]: 2025-12-05 06:04:40.985 186333 DEBUG oslo_concurrency.processutils [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.013s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:04:40 compute-0 nova_compute[186329]: 2025-12-05 06:04:40.985 186333 DEBUG nova.compute.resource_tracker [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6192MB free_disk=73.36912155151367GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 05 06:04:40 compute-0 nova_compute[186329]: 2025-12-05 06:04:40.985 186333 DEBUG oslo_concurrency.lockutils [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:04:40 compute-0 nova_compute[186329]: 2025-12-05 06:04:40.986 186333 DEBUG oslo_concurrency.lockutils [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:04:41 compute-0 python3.9[186981]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 05 06:04:41 compute-0 systemd[1]: Reloading.
Dec 05 06:04:41 compute-0 systemd-sysv-generator[187008]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 06:04:41 compute-0 systemd-rc-local-generator[187003]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 06:04:41 compute-0 sudo[186979]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:41 compute-0 nova_compute[186329]: 2025-12-05 06:04:41.491 186333 WARNING nova.compute.resource_tracker [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] No compute node record for compute-0.ctlplane.example.com:f2df025e-56e9-4920-9fad-1a12202c4aeb: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host f2df025e-56e9-4920-9fad-1a12202c4aeb could not be found.
Dec 05 06:04:41 compute-0 python3.9[187168]: ansible-ansible.builtin.service_facts Invoked
Dec 05 06:04:41 compute-0 network[187185]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 05 06:04:41 compute-0 network[187186]: 'network-scripts' will be removed from distribution in near future.
Dec 05 06:04:41 compute-0 network[187187]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 05 06:04:41 compute-0 nova_compute[186329]: 2025-12-05 06:04:41.998 186333 INFO nova.compute.resource_tracker [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] Compute node record created for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com with uuid: f2df025e-56e9-4920-9fad-1a12202c4aeb
Dec 05 06:04:43 compute-0 nova_compute[186329]: 2025-12-05 06:04:43.525 186333 DEBUG nova.compute.resource_tracker [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 05 06:04:43 compute-0 nova_compute[186329]: 2025-12-05 06:04:43.526 186333 DEBUG nova.compute.resource_tracker [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 06:04:40 up 42 min,  0 user,  load average: 0.87, 0.75, 0.50\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 05 06:04:44 compute-0 nova_compute[186329]: 2025-12-05 06:04:44.029 186333 INFO nova.scheduler.client.report [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] [req-c30eb4f6-f7a6-46f6-9df4-ab303215d330] Created resource provider record via placement API for resource provider with UUID f2df025e-56e9-4920-9fad-1a12202c4aeb and name compute-0.ctlplane.example.com.
Dec 05 06:04:44 compute-0 nova_compute[186329]: 2025-12-05 06:04:44.048 186333 DEBUG nova.virt.libvirt.host [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Dec 05 06:04:44 compute-0 nova_compute[186329]: ] _kernel_supports_amd_sev /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1953
Dec 05 06:04:44 compute-0 nova_compute[186329]: 2025-12-05 06:04:44.048 186333 INFO nova.virt.libvirt.host [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] kernel doesn't support AMD SEV
Dec 05 06:04:44 compute-0 nova_compute[186329]: 2025-12-05 06:04:44.048 186333 DEBUG nova.compute.provider_tree [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] Updating inventory in ProviderTree for provider f2df025e-56e9-4920-9fad-1a12202c4aeb with inventory: {'MEMORY_MB': {'total': 7681, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 4, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 05 06:04:44 compute-0 nova_compute[186329]: 2025-12-05 06:04:44.048 186333 DEBUG nova.virt.libvirt.driver [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 05 06:04:44 compute-0 nova_compute[186329]: 2025-12-05 06:04:44.050 186333 DEBUG nova.virt.libvirt.driver [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] Libvirt baseline CPU <cpu>
Dec 05 06:04:44 compute-0 nova_compute[186329]:   <arch>x86_64</arch>
Dec 05 06:04:44 compute-0 nova_compute[186329]:   <model>Nehalem</model>
Dec 05 06:04:44 compute-0 nova_compute[186329]:   <vendor>AMD</vendor>
Dec 05 06:04:44 compute-0 nova_compute[186329]:   <topology sockets="4" cores="1" threads="1"/>
Dec 05 06:04:44 compute-0 nova_compute[186329]:   <maxphysaddr mode="emulate" bits="48"/>
Dec 05 06:04:44 compute-0 nova_compute[186329]: </cpu>
Dec 05 06:04:44 compute-0 nova_compute[186329]:  _get_guest_baseline_cpu_features /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13545
Dec 05 06:04:44 compute-0 nova_compute[186329]: 2025-12-05 06:04:44.583 186333 DEBUG nova.scheduler.client.report [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] Updated inventory for provider f2df025e-56e9-4920-9fad-1a12202c4aeb with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7681, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 4, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:975
Dec 05 06:04:44 compute-0 nova_compute[186329]: 2025-12-05 06:04:44.583 186333 DEBUG nova.compute.provider_tree [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] Updating resource provider f2df025e-56e9-4920-9fad-1a12202c4aeb generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Dec 05 06:04:44 compute-0 nova_compute[186329]: 2025-12-05 06:04:44.583 186333 DEBUG nova.compute.provider_tree [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] Updating inventory in ProviderTree for provider f2df025e-56e9-4920-9fad-1a12202c4aeb with inventory: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 05 06:04:44 compute-0 nova_compute[186329]: 2025-12-05 06:04:44.741 186333 DEBUG nova.compute.provider_tree [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] Updating resource provider f2df025e-56e9-4920-9fad-1a12202c4aeb generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Dec 05 06:04:45 compute-0 nova_compute[186329]: 2025-12-05 06:04:45.246 186333 DEBUG nova.compute.resource_tracker [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 05 06:04:45 compute-0 nova_compute[186329]: 2025-12-05 06:04:45.247 186333 DEBUG oslo_concurrency.lockutils [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.261s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:04:45 compute-0 nova_compute[186329]: 2025-12-05 06:04:45.247 186333 DEBUG nova.service [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.12/site-packages/nova/service.py:177
Dec 05 06:04:45 compute-0 sudo[187459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmazpriawqzdrqgrwfznctmabajjboka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914685.2121246-90-17234874406528/AnsiballZ_systemd_service.py'
Dec 05 06:04:45 compute-0 sudo[187459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:45 compute-0 python3.9[187461]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 06:04:45 compute-0 sudo[187459]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:46 compute-0 sudo[187612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arkmzsafbwmoyofzbjwjrmbhehskzjgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914686.1054387-110-214728021584146/AnsiballZ_file.py'
Dec 05 06:04:46 compute-0 sudo[187612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:46 compute-0 python3.9[187614]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:04:46 compute-0 sudo[187612]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:46 compute-0 rsyslogd[961]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 05 06:04:46 compute-0 rsyslogd[961]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 05 06:04:47 compute-0 sudo[187765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahaeijonuzskgsklkmjpykpiqougrffe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914686.846823-126-193439372269178/AnsiballZ_file.py'
Dec 05 06:04:47 compute-0 sudo[187765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:47 compute-0 nova_compute[186329]: 2025-12-05 06:04:47.114 186333 DEBUG nova.service [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.12/site-packages/nova/service.py:194
Dec 05 06:04:47 compute-0 nova_compute[186329]: 2025-12-05 06:04:47.114 186333 DEBUG nova.servicegroup.drivers.db [None req-1d122446-0d12-4d3c-80a8-1a2f7c8c6299 - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.12/site-packages/nova/servicegroup/drivers/db.py:44
Dec 05 06:04:47 compute-0 python3.9[187767]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:04:47 compute-0 sudo[187765]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:47 compute-0 sudo[187917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsanxxnenfuabqprjzkrpdouskzuoifw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914687.418804-144-260961337768095/AnsiballZ_command.py'
Dec 05 06:04:47 compute-0 sudo[187917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:47 compute-0 python3.9[187919]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 06:04:47 compute-0 sudo[187917]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:48 compute-0 python3.9[188071]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 05 06:04:48 compute-0 sudo[188221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbvkzmyzyyrsqjthzlxwbwhffomjpcos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914688.6765044-180-186501559348250/AnsiballZ_systemd_service.py'
Dec 05 06:04:48 compute-0 sudo[188221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:49 compute-0 python3.9[188223]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 05 06:04:49 compute-0 systemd[1]: Reloading.
Dec 05 06:04:49 compute-0 systemd-rc-local-generator[188243]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 06:04:49 compute-0 systemd-sysv-generator[188246]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 06:04:49 compute-0 sudo[188221]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:49 compute-0 sudo[188407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ueyafuxiwasyozayjompflkjjegkhqyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914689.4770656-196-45458888859647/AnsiballZ_command.py'
Dec 05 06:04:49 compute-0 sudo[188407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:49 compute-0 python3.9[188409]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 06:04:49 compute-0 sudo[188407]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:50 compute-0 sudo[188560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adymuwxgnntcslwbkbnmxwlsawmbbynv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914690.0297718-214-242815064517399/AnsiballZ_file.py'
Dec 05 06:04:50 compute-0 sudo[188560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:50 compute-0 python3.9[188562]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 06:04:50 compute-0 sudo[188560]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:50 compute-0 python3.9[188712]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 06:04:51 compute-0 python3.9[188864]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:04:51 compute-0 python3.9[188985]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764914691.0987537-246-108981008634022/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=e86e0e43000ce9ccfe5aefbf8e8f2e3d15d05584 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 05 06:04:52 compute-0 nova_compute[186329]: 2025-12-05 06:04:52.116 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:04:52 compute-0 sudo[189135]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbsxwjatxwdzskhhsvmrhqgefujwlcnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914692.0546944-276-38642925195469/AnsiballZ_group.py'
Dec 05 06:04:52 compute-0 sudo[189135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:52 compute-0 python3.9[189137]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Dec 05 06:04:52 compute-0 sudo[189135]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:52 compute-0 nova_compute[186329]: 2025-12-05 06:04:52.643 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:04:53 compute-0 sudo[189287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpxrsrbsfazrqwzzkzbvaexlvxmazhyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914692.791361-298-248450929250864/AnsiballZ_getent.py'
Dec 05 06:04:53 compute-0 sudo[189287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:53 compute-0 python3.9[189289]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Dec 05 06:04:53 compute-0 sudo[189287]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:53 compute-0 sudo[189440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlcbvsddbqbfiqruydkgvxddweqgvyqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914693.4033153-314-178406075541734/AnsiballZ_group.py'
Dec 05 06:04:53 compute-0 sudo[189440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:53 compute-0 python3.9[189442]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 05 06:04:53 compute-0 groupadd[189443]: group added to /etc/group: name=ceilometer, GID=42405
Dec 05 06:04:53 compute-0 groupadd[189443]: group added to /etc/gshadow: name=ceilometer
Dec 05 06:04:53 compute-0 groupadd[189443]: new group: name=ceilometer, GID=42405
Dec 05 06:04:53 compute-0 sudo[189440]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:54 compute-0 sudo[189598]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yokdgerzijddkdeuunvhhqdrkjbancmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914693.9026134-330-270867714333228/AnsiballZ_user.py'
Dec 05 06:04:54 compute-0 sudo[189598]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:04:54 compute-0 python3.9[189600]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 05 06:04:54 compute-0 useradd[189616]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Dec 05 06:04:54 compute-0 useradd[189616]: add 'ceilometer' to group 'libvirt'
Dec 05 06:04:54 compute-0 useradd[189616]: add 'ceilometer' to shadow group 'libvirt'
Dec 05 06:04:54 compute-0 podman[189601]: 2025-12-05 06:04:54.472050659 +0000 UTC m=+0.058796718 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 05 06:04:54 compute-0 sudo[189598]: pam_unix(sudo:session): session closed for user root
Dec 05 06:04:55 compute-0 python3.9[189781]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:04:55 compute-0 python3.9[189902]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764914695.1319566-382-64300272413933/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:04:56 compute-0 python3.9[190052]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:04:56 compute-0 python3.9[190173]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764914695.9966955-382-188956477997789/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:04:57 compute-0 python3.9[190323]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:04:57 compute-0 python3.9[190444]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764914696.7987592-382-240540936126163/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:04:57 compute-0 python3.9[190594]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 06:04:58 compute-0 python3.9[190746]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 06:04:58 compute-0 python3.9[190898]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:04:59 compute-0 python3.9[191019]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764914698.5310664-500-253776671463452/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:04:59 compute-0 python3.9[191169]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:04:59 compute-0 python3.9[191245]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:05:00 compute-0 python3.9[191395]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:05:00 compute-0 python3.9[191516]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764914700.0802536-500-107349941401790/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=8b78133c57f17951ab2e52e9d318c132bb1ce6c8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:05:01 compute-0 python3.9[191666]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:05:01 compute-0 python3.9[191787]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764914700.9913592-500-34599889753860/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:05:02 compute-0 python3.9[191937]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:05:02 compute-0 python3.9[192058]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764914701.790746-500-28171663961806/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:05:02 compute-0 podman[192059]: 2025-12-05 06:05:02.555419411 +0000 UTC m=+0.040532484 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 05 06:05:02 compute-0 podman[192199]: 2025-12-05 06:05:02.835449784 +0000 UTC m=+0.064294800 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 05 06:05:02 compute-0 python3.9[192235]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:05:03 compute-0 python3.9[192363]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764914702.6121633-500-13282508972529/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=3820eb6e48c35431ebf53228213a5d51b7591223 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:05:03 compute-0 python3.9[192513]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:05:04 compute-0 python3.9[192634]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764914703.4213228-500-68789845170689/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:05:04 compute-0 python3.9[192784]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:05:04 compute-0 python3.9[192905]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764914704.2184596-500-252094310996732/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=33df3bf08923ad9105770f5abb51d4cde791931a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:05:05 compute-0 python3.9[193055]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:05:05 compute-0 python3.9[193176]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764914705.1804256-500-281471550311691/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:05:06 compute-0 python3.9[193326]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:05:06 compute-0 python3.9[193447]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764914706.0516732-500-141185165041300/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=8bed8129af2c9145e8d37569bb493c0de1895d6f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:05:07 compute-0 python3.9[193597]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:05:07 compute-0 python3.9[193718]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764914706.8565176-500-108950050252816/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:05:08 compute-0 python3.9[193868]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:05:08 compute-0 python3.9[193944]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:05:08 compute-0 python3.9[194094]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:05:09 compute-0 python3.9[194170]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:05:09 compute-0 python3.9[194320]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:05:10 compute-0 python3.9[194396]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:05:10 compute-0 sudo[194546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tubzmqlvdnpvitqhcfmvoaqjjbhtpkkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914710.3846645-878-247060619904811/AnsiballZ_file.py'
Dec 05 06:05:10 compute-0 sudo[194546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:10 compute-0 python3.9[194548]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:05:10 compute-0 sudo[194546]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:11 compute-0 sudo[194698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpgsjhareoxoxgnhzlrcutohcobkhrsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914710.8586466-894-168159023804933/AnsiballZ_file.py'
Dec 05 06:05:11 compute-0 sudo[194698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:11 compute-0 python3.9[194700]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:05:11 compute-0 sudo[194698]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:11 compute-0 sudo[194850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdjitpzrctamukhxqahnlravsnamnywn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914711.3480663-910-49180937238572/AnsiballZ_file.py'
Dec 05 06:05:11 compute-0 sudo[194850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:11 compute-0 python3.9[194852]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 05 06:05:11 compute-0 sudo[194850]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:11 compute-0 sudo[195002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkjdxwgqlrusoowsaxfnpnepssbiekpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914711.8215327-926-23290866253998/AnsiballZ_systemd_service.py'
Dec 05 06:05:11 compute-0 sudo[195002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:12 compute-0 python3.9[195004]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 06:05:12 compute-0 systemd[1]: Reloading.
Dec 05 06:05:12 compute-0 systemd-sysv-generator[195032]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 06:05:12 compute-0 systemd-rc-local-generator[195029]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 06:05:12 compute-0 systemd[1]: Listening on Podman API Socket.
Dec 05 06:05:12 compute-0 sudo[195002]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:12 compute-0 sudo[195192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axzhyussmhmnjfhsjvbhporvdhwwqqtc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914712.7968807-944-96461820432831/AnsiballZ_stat.py'
Dec 05 06:05:12 compute-0 sudo[195192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:13 compute-0 python3.9[195194]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:05:13 compute-0 sudo[195192]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:13 compute-0 sudo[195315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yadilpmhpyozfmokgnkljhqrdncetrmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914712.7968807-944-96461820432831/AnsiballZ_copy.py'
Dec 05 06:05:13 compute-0 sudo[195315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:13 compute-0 python3.9[195317]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764914712.7968807-944-96461820432831/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 05 06:05:13 compute-0 sudo[195315]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:14 compute-0 sudo[195467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfmsuckcpxjiqfdylydgxbtviecntcnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914713.7827823-978-257720751538217/AnsiballZ_container_config_data.py'
Dec 05 06:05:14 compute-0 sudo[195467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:14 compute-0 python3.9[195469]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Dec 05 06:05:14 compute-0 sudo[195467]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:14 compute-0 sudo[195619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkjuraufcbglmirdotsggcqxbxugtqqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914714.5426803-996-102033881120643/AnsiballZ_container_config_hash.py'
Dec 05 06:05:14 compute-0 sudo[195619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:15 compute-0 python3.9[195621]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 05 06:05:15 compute-0 sudo[195619]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:15 compute-0 sudo[195771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvipzarfpengbypuhpydlcbfiqgwbjpp ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764914715.3535144-1016-22616229047489/AnsiballZ_edpm_container_manage.py'
Dec 05 06:05:15 compute-0 sudo[195771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:15 compute-0 python3[195773]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Dec 05 06:05:19 compute-0 podman[195785]: 2025-12-05 06:05:19.392575339 +0000 UTC m=+3.423879328 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Dec 05 06:05:19 compute-0 podman[195867]: 2025-12-05 06:05:19.475903827 +0000 UTC m=+0.025683559 container create f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=edpm, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 06:05:19 compute-0 podman[195867]: 2025-12-05 06:05:19.463865476 +0000 UTC m=+0.013645228 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Dec 05 06:05:19 compute-0 python3[195773]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Dec 05 06:05:19 compute-0 sudo[195771]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:19 compute-0 sudo[196043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-walabrqoycmfypmrvnztlnkpggjhqpof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914719.68868-1032-192179279527281/AnsiballZ_stat.py'
Dec 05 06:05:19 compute-0 sudo[196043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:20 compute-0 python3.9[196045]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 06:05:20 compute-0 sudo[196043]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:20 compute-0 sudo[196197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krrxfpgltrkuhkupejeyuzxmdnlonncj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914720.2267237-1050-274109882289789/AnsiballZ_file.py'
Dec 05 06:05:20 compute-0 sudo[196197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:20 compute-0 python3.9[196199]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:05:20 compute-0 sudo[196197]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:20 compute-0 sudo[196348]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zajhtrsfwlemlzqndpvmqckvllpnlixw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914720.5996082-1050-109015906556404/AnsiballZ_copy.py'
Dec 05 06:05:20 compute-0 sudo[196348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:21 compute-0 python3.9[196350]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764914720.5996082-1050-109015906556404/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:05:21 compute-0 sudo[196348]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:21 compute-0 sudo[196424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esflwuwkursfnrsvgfnhjxcewmfkaeds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914720.5996082-1050-109015906556404/AnsiballZ_systemd.py'
Dec 05 06:05:21 compute-0 sudo[196424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:21 compute-0 python3.9[196426]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 05 06:05:21 compute-0 systemd[1]: Reloading.
Dec 05 06:05:21 compute-0 systemd-sysv-generator[196451]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 06:05:21 compute-0 systemd-rc-local-generator[196447]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 06:05:21 compute-0 sudo[196424]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:22 compute-0 sudo[196535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxlbzhzbsnlcbezutssplzofwpwthzpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914720.5996082-1050-109015906556404/AnsiballZ_systemd.py'
Dec 05 06:05:22 compute-0 sudo[196535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:22 compute-0 python3.9[196537]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 06:05:22 compute-0 systemd[1]: Reloading.
Dec 05 06:05:22 compute-0 systemd-sysv-generator[196565]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 06:05:22 compute-0 systemd-rc-local-generator[196562]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 06:05:22 compute-0 systemd[1]: Starting podman_exporter container...
Dec 05 06:05:22 compute-0 systemd[1]: Started libcrun container.
Dec 05 06:05:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e1da46cda420f72f2e48113c14a7b0f860cd9e94b5537e717f1f9cfc85e35902/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 05 06:05:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e1da46cda420f72f2e48113c14a7b0f860cd9e94b5537e717f1f9cfc85e35902/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 05 06:05:22 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca.
Dec 05 06:05:22 compute-0 podman[196576]: 2025-12-05 06:05:22.715501333 +0000 UTC m=+0.081052966 container init f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 06:05:22 compute-0 podman_exporter[196588]: ts=2025-12-05T06:05:22.727Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Dec 05 06:05:22 compute-0 podman_exporter[196588]: ts=2025-12-05T06:05:22.727Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Dec 05 06:05:22 compute-0 podman_exporter[196588]: ts=2025-12-05T06:05:22.727Z caller=handler.go:94 level=info msg="enabled collectors"
Dec 05 06:05:22 compute-0 podman_exporter[196588]: ts=2025-12-05T06:05:22.727Z caller=handler.go:105 level=info collector=container
Dec 05 06:05:22 compute-0 podman[196576]: 2025-12-05 06:05:22.73834624 +0000 UTC m=+0.103897874 container start f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 06:05:22 compute-0 podman[196576]: podman_exporter
Dec 05 06:05:22 compute-0 systemd[1]: Starting Podman API Service...
Dec 05 06:05:22 compute-0 systemd[1]: Started podman_exporter container.
Dec 05 06:05:22 compute-0 systemd[1]: Started Podman API Service.
Dec 05 06:05:22 compute-0 podman[196599]: time="2025-12-05T06:05:22Z" level=info msg="/usr/bin/podman filtering at log level info"
Dec 05 06:05:22 compute-0 podman[196599]: time="2025-12-05T06:05:22Z" level=info msg="Setting parallel job count to 13"
Dec 05 06:05:22 compute-0 podman[196599]: time="2025-12-05T06:05:22Z" level=info msg="Using sqlite as database backend"
Dec 05 06:05:22 compute-0 podman[196599]: time="2025-12-05T06:05:22Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Dec 05 06:05:22 compute-0 podman[196599]: time="2025-12-05T06:05:22Z" level=info msg="Using systemd socket activation to determine API endpoint"
Dec 05 06:05:22 compute-0 podman[196599]: time="2025-12-05T06:05:22Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Dec 05 06:05:22 compute-0 sudo[196535]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:22 compute-0 podman[196599]: @ - - [05/Dec/2025:06:05:22 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Dec 05 06:05:22 compute-0 podman[196599]: time="2025-12-05T06:05:22Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:05:22 compute-0 podman[196598]: 2025-12-05 06:05:22.790927855 +0000 UTC m=+0.046157115 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 06:05:22 compute-0 podman[196599]: @ - - [05/Dec/2025:06:05:22 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 14218 "" "Go-http-client/1.1"
Dec 05 06:05:22 compute-0 podman_exporter[196588]: ts=2025-12-05T06:05:22.794Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Dec 05 06:05:22 compute-0 systemd[1]: f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca-6871e1e11ef5ad8b.service: Main process exited, code=exited, status=1/FAILURE
Dec 05 06:05:22 compute-0 podman_exporter[196588]: ts=2025-12-05T06:05:22.794Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Dec 05 06:05:22 compute-0 systemd[1]: f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca-6871e1e11ef5ad8b.service: Failed with result 'exit-code'.
Dec 05 06:05:22 compute-0 podman_exporter[196588]: ts=2025-12-05T06:05:22.795Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Dec 05 06:05:23 compute-0 sudo[196779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvcuwvkodawwdisvbszzpdnljvelebxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914722.8847811-1098-249443310781868/AnsiballZ_systemd.py'
Dec 05 06:05:23 compute-0 sudo[196779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:23 compute-0 python3.9[196781]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 06:05:23 compute-0 systemd[1]: Stopping podman_exporter container...
Dec 05 06:05:23 compute-0 podman[196599]: @ - - [05/Dec/2025:06:05:22 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 1641 "" "Go-http-client/1.1"
Dec 05 06:05:23 compute-0 systemd[1]: libpod-f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca.scope: Deactivated successfully.
Dec 05 06:05:23 compute-0 podman[196785]: 2025-12-05 06:05:23.392707211 +0000 UTC m=+0.035049081 container died f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 06:05:23 compute-0 systemd[1]: f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca-6871e1e11ef5ad8b.timer: Deactivated successfully.
Dec 05 06:05:23 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca.
Dec 05 06:05:23 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca-userdata-shm.mount: Deactivated successfully.
Dec 05 06:05:23 compute-0 systemd[1]: var-lib-containers-storage-overlay-e1da46cda420f72f2e48113c14a7b0f860cd9e94b5537e717f1f9cfc85e35902-merged.mount: Deactivated successfully.
Dec 05 06:05:23 compute-0 podman[196785]: 2025-12-05 06:05:23.519274412 +0000 UTC m=+0.161616283 container cleanup f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 06:05:23 compute-0 podman[196785]: podman_exporter
Dec 05 06:05:23 compute-0 systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Dec 05 06:05:23 compute-0 podman[196807]: podman_exporter
Dec 05 06:05:23 compute-0 systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Dec 05 06:05:23 compute-0 systemd[1]: Stopped podman_exporter container.
Dec 05 06:05:23 compute-0 systemd[1]: Starting podman_exporter container...
Dec 05 06:05:23 compute-0 systemd[1]: Started libcrun container.
Dec 05 06:05:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e1da46cda420f72f2e48113c14a7b0f860cd9e94b5537e717f1f9cfc85e35902/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 05 06:05:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e1da46cda420f72f2e48113c14a7b0f860cd9e94b5537e717f1f9cfc85e35902/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 05 06:05:23 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca.
Dec 05 06:05:23 compute-0 podman[196817]: 2025-12-05 06:05:23.649965646 +0000 UTC m=+0.071404904 container init f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 06:05:23 compute-0 podman_exporter[196829]: ts=2025-12-05T06:05:23.659Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Dec 05 06:05:23 compute-0 podman_exporter[196829]: ts=2025-12-05T06:05:23.659Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Dec 05 06:05:23 compute-0 podman_exporter[196829]: ts=2025-12-05T06:05:23.659Z caller=handler.go:94 level=info msg="enabled collectors"
Dec 05 06:05:23 compute-0 podman_exporter[196829]: ts=2025-12-05T06:05:23.659Z caller=handler.go:105 level=info collector=container
Dec 05 06:05:23 compute-0 podman[196599]: @ - - [05/Dec/2025:06:05:23 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Dec 05 06:05:23 compute-0 podman[196599]: time="2025-12-05T06:05:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:05:23 compute-0 podman[196817]: 2025-12-05 06:05:23.66964413 +0000 UTC m=+0.091083368 container start f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 05 06:05:23 compute-0 podman[196817]: podman_exporter
Dec 05 06:05:23 compute-0 podman[196599]: @ - - [05/Dec/2025:06:05:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 14220 "" "Go-http-client/1.1"
Dec 05 06:05:23 compute-0 podman_exporter[196829]: ts=2025-12-05T06:05:23.672Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Dec 05 06:05:23 compute-0 podman_exporter[196829]: ts=2025-12-05T06:05:23.673Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Dec 05 06:05:23 compute-0 podman_exporter[196829]: ts=2025-12-05T06:05:23.673Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Dec 05 06:05:23 compute-0 systemd[1]: Started podman_exporter container.
Dec 05 06:05:23 compute-0 sudo[196779]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:23 compute-0 podman[196840]: 2025-12-05 06:05:23.715486463 +0000 UTC m=+0.038356700 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 06:05:24 compute-0 sudo[197011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzrxqthyxwgbvtyktdyfdycbyhufxirl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914723.884846-1114-77382491957674/AnsiballZ_stat.py'
Dec 05 06:05:24 compute-0 sudo[197011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:24 compute-0 python3.9[197013]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:05:24 compute-0 sudo[197011]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:24 compute-0 sudo[197134]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldkjmlcdodogobamzmbvcnotqsrdtcfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914723.884846-1114-77382491957674/AnsiballZ_copy.py'
Dec 05 06:05:24 compute-0 sudo[197134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:24 compute-0 python3.9[197136]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764914723.884846-1114-77382491957674/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 05 06:05:24 compute-0 sudo[197134]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:24 compute-0 podman[197137]: 2025-12-05 06:05:24.705147742 +0000 UTC m=+0.081145190 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible)
Dec 05 06:05:25 compute-0 sudo[197309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntfmlhmhsksbvkgsjeopmilnsqevvytn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914724.8763485-1148-199196252234289/AnsiballZ_container_config_data.py'
Dec 05 06:05:25 compute-0 sudo[197309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:25 compute-0 python3.9[197311]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Dec 05 06:05:25 compute-0 sudo[197309]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:25 compute-0 sudo[197461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icvcuazikyexxhwxbpblrblsphvtlqxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914725.4075222-1166-262108049979722/AnsiballZ_container_config_hash.py'
Dec 05 06:05:25 compute-0 sudo[197461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:25 compute-0 python3.9[197463]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 05 06:05:25 compute-0 sudo[197461]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:26 compute-0 sudo[197613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-beknohpfijsdhtkzwqxhfcalrdiqnnji ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764914725.99424-1186-117660568694708/AnsiballZ_edpm_container_manage.py'
Dec 05 06:05:26 compute-0 sudo[197613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:26 compute-0 python3[197615]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Dec 05 06:05:28 compute-0 podman[197625]: 2025-12-05 06:05:28.830614091 +0000 UTC m=+2.406918920 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Dec 05 06:05:28 compute-0 podman[197702]: 2025-12-05 06:05:28.921238706 +0000 UTC m=+0.027081544 container create 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, maintainer=Red Hat, Inc., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, version=9.6, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 05 06:05:28 compute-0 podman[197702]: 2025-12-05 06:05:28.90871836 +0000 UTC m=+0.014561208 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Dec 05 06:05:28 compute-0 python3[197615]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Dec 05 06:05:29 compute-0 sudo[197613]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:29 compute-0 sudo[197878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmrxtfczesqjftsjhdoxgjujezdpdlju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914729.1229746-1202-208355556238099/AnsiballZ_stat.py'
Dec 05 06:05:29 compute-0 sudo[197878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:29 compute-0 python3.9[197880]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 06:05:29 compute-0 sudo[197878]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:05:29.484 104041 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:05:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:05:29.485 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:05:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:05:29.485 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:05:29 compute-0 sudo[198033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsmsrdnkzvpfqgbbkfcbryeqetobkfuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914729.677794-1220-83731037879394/AnsiballZ_file.py'
Dec 05 06:05:29 compute-0 sudo[198033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:30 compute-0 python3.9[198035]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:05:30 compute-0 sudo[198033]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:30 compute-0 sudo[198184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzhibamrszelbrgqjlzfqjtxuipnbglc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914730.0562205-1220-100288961091933/AnsiballZ_copy.py'
Dec 05 06:05:30 compute-0 sudo[198184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:30 compute-0 python3.9[198186]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764914730.0562205-1220-100288961091933/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:05:30 compute-0 sudo[198184]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:30 compute-0 sudo[198260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgktilkxpiwkulqkviofjxfuosgqqscm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914730.0562205-1220-100288961091933/AnsiballZ_systemd.py'
Dec 05 06:05:30 compute-0 sudo[198260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:30 compute-0 python3.9[198262]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 05 06:05:30 compute-0 systemd[1]: Reloading.
Dec 05 06:05:30 compute-0 systemd-sysv-generator[198286]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 06:05:30 compute-0 systemd-rc-local-generator[198282]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 06:05:31 compute-0 sudo[198260]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:31 compute-0 sudo[198371]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxsnxlkgfflohxraftuwebfqedheotvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914730.0562205-1220-100288961091933/AnsiballZ_systemd.py'
Dec 05 06:05:31 compute-0 sudo[198371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:31 compute-0 python3.9[198373]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 05 06:05:31 compute-0 systemd[1]: Reloading.
Dec 05 06:05:31 compute-0 systemd-sysv-generator[198399]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 05 06:05:31 compute-0 systemd-rc-local-generator[198396]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 05 06:05:31 compute-0 systemd[1]: Starting openstack_network_exporter container...
Dec 05 06:05:31 compute-0 systemd[1]: Started libcrun container.
Dec 05 06:05:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9322867cd3dccb6d1cd7d1d31425ead542ec15cef6561e9a58ec6d12633fa96f/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec 05 06:05:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9322867cd3dccb6d1cd7d1d31425ead542ec15cef6561e9a58ec6d12633fa96f/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 05 06:05:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9322867cd3dccb6d1cd7d1d31425ead542ec15cef6561e9a58ec6d12633fa96f/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 05 06:05:31 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454.
Dec 05 06:05:31 compute-0 podman[198413]: 2025-12-05 06:05:31.884551446 +0000 UTC m=+0.077724448 container init 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 05 06:05:31 compute-0 openstack_network_exporter[198425]: INFO    06:05:31 main.go:48: registering *bridge.Collector
Dec 05 06:05:31 compute-0 openstack_network_exporter[198425]: INFO    06:05:31 main.go:48: registering *coverage.Collector
Dec 05 06:05:31 compute-0 openstack_network_exporter[198425]: INFO    06:05:31 main.go:48: registering *datapath.Collector
Dec 05 06:05:31 compute-0 openstack_network_exporter[198425]: INFO    06:05:31 main.go:48: registering *iface.Collector
Dec 05 06:05:31 compute-0 openstack_network_exporter[198425]: INFO    06:05:31 main.go:48: registering *memory.Collector
Dec 05 06:05:31 compute-0 openstack_network_exporter[198425]: INFO    06:05:31 main.go:48: registering *ovnnorthd.Collector
Dec 05 06:05:31 compute-0 openstack_network_exporter[198425]: INFO    06:05:31 main.go:48: registering *ovn.Collector
Dec 05 06:05:31 compute-0 openstack_network_exporter[198425]: INFO    06:05:31 main.go:48: registering *ovsdbserver.Collector
Dec 05 06:05:31 compute-0 openstack_network_exporter[198425]: INFO    06:05:31 main.go:48: registering *pmd_perf.Collector
Dec 05 06:05:31 compute-0 openstack_network_exporter[198425]: INFO    06:05:31 main.go:48: registering *pmd_rxq.Collector
Dec 05 06:05:31 compute-0 openstack_network_exporter[198425]: INFO    06:05:31 main.go:48: registering *vswitch.Collector
Dec 05 06:05:31 compute-0 openstack_network_exporter[198425]: NOTICE  06:05:31 main.go:76: listening on https://:9105/metrics
Dec 05 06:05:31 compute-0 podman[198413]: 2025-12-05 06:05:31.905129378 +0000 UTC m=+0.098302370 container start 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, io.buildah.version=1.33.7, vcs-type=git, distribution-scope=public, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, name=ubi9-minimal)
Dec 05 06:05:31 compute-0 podman[198413]: openstack_network_exporter
Dec 05 06:05:31 compute-0 systemd[1]: Started openstack_network_exporter container.
Dec 05 06:05:31 compute-0 sudo[198371]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:31 compute-0 podman[198435]: 2025-12-05 06:05:31.96676493 +0000 UTC m=+0.054231513 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, distribution-scope=public, version=9.6, io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, release=1755695350, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, name=ubi9-minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 05 06:05:32 compute-0 sudo[198604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmwpsqixstruqbpgwhzrsavrlzglvmhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914732.1414804-1268-109472627457441/AnsiballZ_systemd.py'
Dec 05 06:05:32 compute-0 sudo[198604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:32 compute-0 python3.9[198606]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 05 06:05:32 compute-0 systemd[1]: Stopping openstack_network_exporter container...
Dec 05 06:05:32 compute-0 podman[198608]: 2025-12-05 06:05:32.657395712 +0000 UTC m=+0.046310062 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 05 06:05:32 compute-0 systemd[1]: libpod-186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454.scope: Deactivated successfully.
Dec 05 06:05:32 compute-0 podman[198617]: 2025-12-05 06:05:32.663549043 +0000 UTC m=+0.032620290 container died 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, description=The Universal Base 
Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1755695350, io.buildah.version=1.33.7, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, version=9.6, com.redhat.component=ubi9-minimal-container, distribution-scope=public, managed_by=edpm_ansible, vcs-type=git)
Dec 05 06:05:32 compute-0 systemd[1]: 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454-68fbcd4479139750.timer: Deactivated successfully.
Dec 05 06:05:32 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454.
Dec 05 06:05:32 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454-userdata-shm.mount: Deactivated successfully.
Dec 05 06:05:32 compute-0 systemd[1]: var-lib-containers-storage-overlay-9322867cd3dccb6d1cd7d1d31425ead542ec15cef6561e9a58ec6d12633fa96f-merged.mount: Deactivated successfully.
Dec 05 06:05:33 compute-0 podman[198617]: 2025-12-05 06:05:33.236375925 +0000 UTC m=+0.605447172 container cleanup 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=ubi9-minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': 
'/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6)
Dec 05 06:05:33 compute-0 podman[198617]: openstack_network_exporter
Dec 05 06:05:33 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Dec 05 06:05:33 compute-0 podman[198651]: openstack_network_exporter
Dec 05 06:05:33 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Dec 05 06:05:33 compute-0 systemd[1]: Stopped openstack_network_exporter container.
Dec 05 06:05:33 compute-0 systemd[1]: Starting openstack_network_exporter container...
Dec 05 06:05:33 compute-0 podman[198650]: 2025-12-05 06:05:33.322434718 +0000 UTC m=+0.069464751 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Dec 05 06:05:33 compute-0 systemd[1]: Started libcrun container.
Dec 05 06:05:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9322867cd3dccb6d1cd7d1d31425ead542ec15cef6561e9a58ec6d12633fa96f/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec 05 06:05:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9322867cd3dccb6d1cd7d1d31425ead542ec15cef6561e9a58ec6d12633fa96f/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 05 06:05:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9322867cd3dccb6d1cd7d1d31425ead542ec15cef6561e9a58ec6d12633fa96f/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 05 06:05:33 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454.
Dec 05 06:05:33 compute-0 podman[198669]: 2025-12-05 06:05:33.368694093 +0000 UTC m=+0.074884355 container init 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, distribution-scope=public, name=ubi9-minimal, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., 
url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 05 06:05:33 compute-0 openstack_network_exporter[198686]: INFO    06:05:33 main.go:48: registering *bridge.Collector
Dec 05 06:05:33 compute-0 openstack_network_exporter[198686]: INFO    06:05:33 main.go:48: registering *coverage.Collector
Dec 05 06:05:33 compute-0 openstack_network_exporter[198686]: INFO    06:05:33 main.go:48: registering *datapath.Collector
Dec 05 06:05:33 compute-0 openstack_network_exporter[198686]: INFO    06:05:33 main.go:48: registering *iface.Collector
Dec 05 06:05:33 compute-0 openstack_network_exporter[198686]: INFO    06:05:33 main.go:48: registering *memory.Collector
Dec 05 06:05:33 compute-0 openstack_network_exporter[198686]: INFO    06:05:33 main.go:48: registering *ovnnorthd.Collector
Dec 05 06:05:33 compute-0 openstack_network_exporter[198686]: INFO    06:05:33 main.go:48: registering *ovn.Collector
Dec 05 06:05:33 compute-0 openstack_network_exporter[198686]: INFO    06:05:33 main.go:48: registering *ovsdbserver.Collector
Dec 05 06:05:33 compute-0 openstack_network_exporter[198686]: INFO    06:05:33 main.go:48: registering *pmd_perf.Collector
Dec 05 06:05:33 compute-0 openstack_network_exporter[198686]: INFO    06:05:33 main.go:48: registering *pmd_rxq.Collector
Dec 05 06:05:33 compute-0 openstack_network_exporter[198686]: INFO    06:05:33 main.go:48: registering *vswitch.Collector
Dec 05 06:05:33 compute-0 openstack_network_exporter[198686]: NOTICE  06:05:33 main.go:76: listening on https://:9105/metrics
Dec 05 06:05:33 compute-0 podman[198669]: 2025-12-05 06:05:33.387356218 +0000 UTC m=+0.093546490 container start 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, version=9.6, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, architecture=x86_64, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=edpm_ansible, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7)
Dec 05 06:05:33 compute-0 podman[198669]: openstack_network_exporter
Dec 05 06:05:33 compute-0 systemd[1]: Started openstack_network_exporter container.
Dec 05 06:05:33 compute-0 sudo[198604]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:33 compute-0 podman[198696]: 2025-12-05 06:05:33.434340795 +0000 UTC m=+0.040891729 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, release=1755695350, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, distribution-scope=public, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, maintainer=Red Hat, Inc., io.openshift.expose-services=, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The 
Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 05 06:05:33 compute-0 sudo[198863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhftvogjdfvykoldkqxnnmqqbdnpymci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914733.531918-1284-128619557541500/AnsiballZ_find.py'
Dec 05 06:05:33 compute-0 sudo[198863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:33 compute-0 python3.9[198865]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 05 06:05:33 compute-0 sudo[198863]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:34 compute-0 sudo[199015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkohzqoqmdkucjndhnzzzkcnofntichf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914734.2081134-1303-184700726770139/AnsiballZ_podman_container_info.py'
Dec 05 06:05:34 compute-0 sudo[199015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:34 compute-0 python3.9[199017]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Dec 05 06:05:34 compute-0 nova_compute[186329]: 2025-12-05 06:05:34.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:05:34 compute-0 nova_compute[186329]: 2025-12-05 06:05:34.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:05:34 compute-0 nova_compute[186329]: 2025-12-05 06:05:34.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:05:34 compute-0 nova_compute[186329]: 2025-12-05 06:05:34.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:05:34 compute-0 nova_compute[186329]: 2025-12-05 06:05:34.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:05:34 compute-0 nova_compute[186329]: 2025-12-05 06:05:34.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:05:34 compute-0 nova_compute[186329]: 2025-12-05 06:05:34.711 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:05:34 compute-0 nova_compute[186329]: 2025-12-05 06:05:34.711 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 05 06:05:34 compute-0 nova_compute[186329]: 2025-12-05 06:05:34.711 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:05:34 compute-0 sudo[199015]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:35 compute-0 sudo[199177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swfxmesdgrichibjqkncyumeqofvsxwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914734.856005-1311-49709265428009/AnsiballZ_podman_container_exec.py'
Dec 05 06:05:35 compute-0 sudo[199177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:35 compute-0 nova_compute[186329]: 2025-12-05 06:05:35.218 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:05:35 compute-0 nova_compute[186329]: 2025-12-05 06:05:35.218 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:05:35 compute-0 nova_compute[186329]: 2025-12-05 06:05:35.218 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:05:35 compute-0 nova_compute[186329]: 2025-12-05 06:05:35.218 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 05 06:05:35 compute-0 nova_compute[186329]: 2025-12-05 06:05:35.393 186333 WARNING nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:05:35 compute-0 nova_compute[186329]: 2025-12-05 06:05:35.394 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:05:35 compute-0 python3.9[199179]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 05 06:05:35 compute-0 nova_compute[186329]: 2025-12-05 06:05:35.408 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.014s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:05:35 compute-0 nova_compute[186329]: 2025-12-05 06:05:35.408 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6029MB free_disk=73.20468139648438GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": 
"0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 05 06:05:35 compute-0 nova_compute[186329]: 2025-12-05 06:05:35.409 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:05:35 compute-0 nova_compute[186329]: 2025-12-05 06:05:35.409 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:05:35 compute-0 systemd[1]: Started libpod-conmon-9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6.scope.
Dec 05 06:05:35 compute-0 podman[199181]: 2025-12-05 06:05:35.466988404 +0000 UTC m=+0.050960642 container exec 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Dec 05 06:05:35 compute-0 podman[199181]: 2025-12-05 06:05:35.469915059 +0000 UTC m=+0.053887297 container exec_died 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, container_name=ovn_controller, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Dec 05 06:05:35 compute-0 sudo[199177]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:35 compute-0 systemd[1]: libpod-conmon-9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6.scope: Deactivated successfully.
Dec 05 06:05:35 compute-0 sudo[199356]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dquktwcnhiqtevlavjjdfpdjsvgvfzjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914735.616231-1319-68819298613595/AnsiballZ_podman_container_exec.py'
Dec 05 06:05:35 compute-0 sudo[199356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:35 compute-0 python3.9[199358]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 05 06:05:36 compute-0 systemd[1]: Started libpod-conmon-9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6.scope.
Dec 05 06:05:36 compute-0 podman[199359]: 2025-12-05 06:05:36.016459809 +0000 UTC m=+0.047304489 container exec 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_managed=true)
Dec 05 06:05:36 compute-0 podman[199375]: 2025-12-05 06:05:36.070992295 +0000 UTC m=+0.044652600 container exec_died 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 05 06:05:36 compute-0 podman[199359]: 2025-12-05 06:05:36.075391204 +0000 UTC m=+0.106235884 container exec_died 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0)
Dec 05 06:05:36 compute-0 systemd[1]: libpod-conmon-9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6.scope: Deactivated successfully.
Dec 05 06:05:36 compute-0 sudo[199356]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:36 compute-0 sudo[199534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emietxlhekxyhwgvmoedwvovvuffroub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914736.226831-1327-76696849458245/AnsiballZ_file.py'
Dec 05 06:05:36 compute-0 sudo[199534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:36 compute-0 nova_compute[186329]: 2025-12-05 06:05:36.559 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 05 06:05:36 compute-0 nova_compute[186329]: 2025-12-05 06:05:36.560 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 06:05:35 up 43 min,  0 user,  load average: 0.98, 0.81, 0.54\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 05 06:05:36 compute-0 python3.9[199536]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:05:36 compute-0 nova_compute[186329]: 2025-12-05 06:05:36.579 186333 DEBUG nova.compute.provider_tree [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:05:36 compute-0 sudo[199534]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:36 compute-0 sudo[199686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ooahoybhutzjvmnvdrbtrsyoknjijeoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914736.7904906-1336-31514071187664/AnsiballZ_podman_container_info.py'
Dec 05 06:05:36 compute-0 sudo[199686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:37 compute-0 nova_compute[186329]: 2025-12-05 06:05:37.084 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:05:37 compute-0 python3.9[199688]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Dec 05 06:05:37 compute-0 sudo[199686]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:37 compute-0 sudo[199847]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llqiduvkthhwenyykfawekraavjpaflv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914737.298929-1344-227015173092715/AnsiballZ_podman_container_exec.py'
Dec 05 06:05:37 compute-0 sudo[199847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:37 compute-0 nova_compute[186329]: 2025-12-05 06:05:37.589 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 05 06:05:37 compute-0 nova_compute[186329]: 2025-12-05 06:05:37.589 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.180s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:05:37 compute-0 python3.9[199849]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 05 06:05:37 compute-0 systemd[1]: Started libpod-conmon-09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22.scope.
Dec 05 06:05:37 compute-0 podman[199850]: 2025-12-05 06:05:37.709091364 +0000 UTC m=+0.045400616 container exec 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Dec 05 06:05:37 compute-0 podman[199868]: 2025-12-05 06:05:37.764931556 +0000 UTC m=+0.046353804 container exec_died 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Dec 05 06:05:37 compute-0 podman[199850]: 2025-12-05 06:05:37.767896803 +0000 UTC m=+0.104206055 container exec_died 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, 
io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, maintainer=OpenStack Kubernetes Operator team)
Dec 05 06:05:37 compute-0 systemd[1]: libpod-conmon-09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22.scope: Deactivated successfully.
Dec 05 06:05:37 compute-0 sudo[199847]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:38 compute-0 sudo[200027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gejtfjqdcgkhkcynpqvuklliuymutfjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914737.9249377-1352-6620923587598/AnsiballZ_podman_container_exec.py'
Dec 05 06:05:38 compute-0 sudo[200027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:38 compute-0 python3.9[200029]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 05 06:05:38 compute-0 systemd[1]: Started libpod-conmon-09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22.scope.
Dec 05 06:05:38 compute-0 podman[200030]: 2025-12-05 06:05:38.324330258 +0000 UTC m=+0.040847697 container exec 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 05 06:05:38 compute-0 podman[200046]: 2025-12-05 06:05:38.377950933 +0000 UTC m=+0.044057733 container exec_died 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, 
org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Dec 05 06:05:38 compute-0 podman[200030]: 2025-12-05 06:05:38.381778509 +0000 UTC m=+0.098295947 container exec_died 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Dec 05 06:05:38 compute-0 systemd[1]: libpod-conmon-09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22.scope: Deactivated successfully.
Dec 05 06:05:38 compute-0 sudo[200027]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:38 compute-0 sudo[200205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rraqccaqqmeebnobpfgwsgsmjrgbjflu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914738.5319192-1360-119779664200207/AnsiballZ_file.py'
Dec 05 06:05:38 compute-0 sudo[200205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:38 compute-0 python3.9[200207]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:05:38 compute-0 sudo[200205]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:39 compute-0 sudo[200357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkwsgdjgxopwnnyqercfaacivyoradbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914739.0516999-1369-249910464969733/AnsiballZ_podman_container_info.py'
Dec 05 06:05:39 compute-0 sudo[200357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:39 compute-0 python3.9[200359]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Dec 05 06:05:39 compute-0 sudo[200357]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:39 compute-0 sudo[200519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvlsiwlpnfjdijufzwarpebsxqwtdxty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914739.572704-1377-154726686411999/AnsiballZ_podman_container_exec.py'
Dec 05 06:05:39 compute-0 sudo[200519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:39 compute-0 python3.9[200521]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 05 06:05:39 compute-0 systemd[1]: Started libpod-conmon-836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0.scope.
Dec 05 06:05:39 compute-0 podman[200522]: 2025-12-05 06:05:39.965145446 +0000 UTC m=+0.048586175 container exec 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4)
Dec 05 06:05:40 compute-0 podman[200538]: 2025-12-05 06:05:40.017955259 +0000 UTC m=+0.043605544 container exec_died 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd)
Dec 05 06:05:40 compute-0 podman[200522]: 2025-12-05 06:05:40.020961432 +0000 UTC m=+0.104402171 container exec_died 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 05 06:05:40 compute-0 systemd[1]: libpod-conmon-836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0.scope: Deactivated successfully.
Dec 05 06:05:40 compute-0 sudo[200519]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:40 compute-0 sudo[200697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbuhmzonbrkvzldwgrmsdseuvvazentl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914740.1595085-1385-77542098392412/AnsiballZ_podman_container_exec.py'
Dec 05 06:05:40 compute-0 sudo[200697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:40 compute-0 python3.9[200699]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 05 06:05:40 compute-0 systemd[1]: Started libpod-conmon-836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0.scope.
Dec 05 06:05:40 compute-0 podman[200700]: 2025-12-05 06:05:40.563910112 +0000 UTC m=+0.043161701 container exec 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_id=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Dec 05 06:05:40 compute-0 podman[200716]: 2025-12-05 06:05:40.615920755 +0000 UTC m=+0.043362738 container exec_died 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Dec 05 06:05:40 compute-0 podman[200700]: 2025-12-05 06:05:40.620019249 +0000 UTC m=+0.099270818 container exec_died 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 05 06:05:40 compute-0 systemd[1]: libpod-conmon-836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0.scope: Deactivated successfully.
Dec 05 06:05:40 compute-0 sudo[200697]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:40 compute-0 sudo[200875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xavybqhoqsanvzzqeledwczaaeyhmxwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914740.7664638-1393-57514874690998/AnsiballZ_file.py'
Dec 05 06:05:40 compute-0 sudo[200875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:41 compute-0 python3.9[200877]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:05:41 compute-0 sudo[200875]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:41 compute-0 sudo[201027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thqoncxdrnsklucgswvabdbnaytpxqsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914741.4114797-1402-162864068657810/AnsiballZ_podman_container_info.py'
Dec 05 06:05:41 compute-0 sudo[201027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:41 compute-0 python3.9[201029]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Dec 05 06:05:41 compute-0 sudo[201027]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:42 compute-0 sudo[201189]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-baxpnlcbqtjyhnljerituupuheqxaxct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914741.9245467-1410-235227175613153/AnsiballZ_podman_container_exec.py'
Dec 05 06:05:42 compute-0 sudo[201189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:42 compute-0 python3.9[201191]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 05 06:05:42 compute-0 systemd[1]: Started libpod-conmon-f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca.scope.
Dec 05 06:05:42 compute-0 podman[201192]: 2025-12-05 06:05:42.327016565 +0000 UTC m=+0.042694474 container exec f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 06:05:42 compute-0 podman[201208]: 2025-12-05 06:05:42.381951057 +0000 UTC m=+0.047160479 container exec_died f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 06:05:42 compute-0 podman[201192]: 2025-12-05 06:05:42.38440873 +0000 UTC m=+0.100086640 container exec_died f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 06:05:42 compute-0 systemd[1]: libpod-conmon-f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca.scope: Deactivated successfully.
Dec 05 06:05:42 compute-0 sudo[201189]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:42 compute-0 sudo[201366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkvddzccdhuopcfdykfmminbhyeoltns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914742.5363224-1418-11916365252152/AnsiballZ_podman_container_exec.py'
Dec 05 06:05:42 compute-0 sudo[201366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:42 compute-0 python3.9[201368]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 05 06:05:42 compute-0 systemd[1]: Started libpod-conmon-f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca.scope.
Dec 05 06:05:42 compute-0 podman[201369]: 2025-12-05 06:05:42.930579088 +0000 UTC m=+0.040205912 container exec f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 06:05:42 compute-0 podman[201384]: 2025-12-05 06:05:42.982934648 +0000 UTC m=+0.044670472 container exec_died f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 06:05:42 compute-0 podman[201369]: 2025-12-05 06:05:42.985094532 +0000 UTC m=+0.094721347 container exec_died f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 06:05:42 compute-0 systemd[1]: libpod-conmon-f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca.scope: Deactivated successfully.
Dec 05 06:05:43 compute-0 sudo[201366]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:43 compute-0 sudo[201544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-watkswkmgnjpkhzoeekrwtgyvzzeffwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914743.1316426-1426-123611836774459/AnsiballZ_file.py'
Dec 05 06:05:43 compute-0 sudo[201544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:43 compute-0 python3.9[201546]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:05:43 compute-0 sudo[201544]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:43 compute-0 sudo[201696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdfxpcanzywhxzudpqeudouxwesrxkkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914743.6347928-1435-186037527234970/AnsiballZ_podman_container_info.py'
Dec 05 06:05:43 compute-0 sudo[201696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:43 compute-0 python3.9[201698]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Dec 05 06:05:44 compute-0 sudo[201696]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:44 compute-0 sudo[201858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jniygnykrxjwohdgsnkmfngowawzfchl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914744.1372812-1443-236941213964195/AnsiballZ_podman_container_exec.py'
Dec 05 06:05:44 compute-0 sudo[201858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:44 compute-0 python3.9[201860]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 05 06:05:44 compute-0 systemd[1]: Started libpod-conmon-186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454.scope.
Dec 05 06:05:44 compute-0 podman[201861]: 2025-12-05 06:05:44.517859852 +0000 UTC m=+0.041178558 container exec 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible, config_id=edpm, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.expose-services=)
Dec 05 06:05:44 compute-0 podman[201877]: 2025-12-05 06:05:44.567996257 +0000 UTC m=+0.042045786 container exec_died 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, config_id=edpm, maintainer=Red Hat, Inc., version=9.6, distribution-scope=public, release=1755695350, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git)
Dec 05 06:05:44 compute-0 podman[201861]: 2025-12-05 06:05:44.571359731 +0000 UTC m=+0.094678436 container exec_died 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, io.openshift.tags=minimal rhel9, version=9.6, distribution-scope=public, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=edpm, container_name=openstack_network_exporter, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 05 06:05:44 compute-0 systemd[1]: libpod-conmon-186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454.scope: Deactivated successfully.
Dec 05 06:05:44 compute-0 sudo[201858]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:44 compute-0 auditd[672]: Audit daemon rotating log files
Dec 05 06:05:44 compute-0 sudo[202037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpjrssutjfkxtszwfplmbateiibqxypk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914744.7238996-1451-258403466076489/AnsiballZ_podman_container_exec.py'
Dec 05 06:05:44 compute-0 sudo[202037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:45 compute-0 python3.9[202039]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 05 06:05:45 compute-0 systemd[1]: Started libpod-conmon-186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454.scope.
Dec 05 06:05:45 compute-0 podman[202040]: 2025-12-05 06:05:45.140165103 +0000 UTC m=+0.055861704 container exec 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.expose-services=, architecture=x86_64, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, release=1755695350, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vendor=Red Hat, Inc.)
Dec 05 06:05:45 compute-0 podman[202040]: 2025-12-05 06:05:45.146967593 +0000 UTC m=+0.062664194 container exec_died 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, version=9.6, distribution-scope=public, name=ubi9-minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-08-20T13:12:41)
Dec 05 06:05:45 compute-0 sudo[202037]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:45 compute-0 systemd[1]: libpod-conmon-186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454.scope: Deactivated successfully.
Dec 05 06:05:45 compute-0 sudo[202215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojfnszvkuhbeyvzugoiuwptyhijysbkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914745.297394-1459-241102926001381/AnsiballZ_file.py'
Dec 05 06:05:45 compute-0 sudo[202215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:45 compute-0 python3.9[202217]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:05:45 compute-0 sudo[202215]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:54 compute-0 podman[202242]: 2025-12-05 06:05:54.451585356 +0000 UTC m=+0.036140009 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 05 06:05:55 compute-0 podman[202263]: 2025-12-05 06:05:55.471155965 +0000 UTC m=+0.053673717 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 05 06:05:56 compute-0 sudo[202411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nofqusxbsmqrjnrtwhlhmqtjbhhyuujc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914756.6504574-1634-280761504019823/AnsiballZ_file.py'
Dec 05 06:05:56 compute-0 sudo[202411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:56 compute-0 python3.9[202413]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:05:56 compute-0 sudo[202411]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:57 compute-0 sudo[202563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyvxrsmtivesgvxloorzkonrfrufcwjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914757.1076348-1650-206163448495278/AnsiballZ_stat.py'
Dec 05 06:05:57 compute-0 sudo[202563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:57 compute-0 python3.9[202565]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:05:57 compute-0 sudo[202563]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:57 compute-0 sudo[202686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjunkacxdbitvtaqzigkhxvdmzqqgzez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914757.1076348-1650-206163448495278/AnsiballZ_copy.py'
Dec 05 06:05:57 compute-0 sudo[202686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:57 compute-0 python3.9[202688]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764914757.1076348-1650-206163448495278/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:05:57 compute-0 sudo[202686]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:58 compute-0 sudo[202838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnjzjvpwiwecjbqgdnjmsjcewtkeqjgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914758.0545793-1682-13633956618834/AnsiballZ_file.py'
Dec 05 06:05:58 compute-0 sudo[202838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:58 compute-0 python3.9[202840]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:05:58 compute-0 sudo[202838]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:58 compute-0 sudo[202990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcbccbxbkgvkhbuaegetjbgfivrpykrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914758.53272-1698-133910923399181/AnsiballZ_stat.py'
Dec 05 06:05:58 compute-0 sudo[202990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:58 compute-0 python3.9[202992]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:05:58 compute-0 sudo[202990]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:59 compute-0 sudo[203068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfavskefatcsyyibyivyoefgrowijtes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914758.53272-1698-133910923399181/AnsiballZ_file.py'
Dec 05 06:05:59 compute-0 sudo[203068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:59 compute-0 python3.9[203070]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:05:59 compute-0 sudo[203068]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:59 compute-0 sudo[203220]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syaktilaonzsvbmsvpumljrirfhbyezs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914759.323678-1722-35197938181464/AnsiballZ_stat.py'
Dec 05 06:05:59 compute-0 sudo[203220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:59 compute-0 python3.9[203222]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:05:59 compute-0 sudo[203220]: pam_unix(sudo:session): session closed for user root
Dec 05 06:05:59 compute-0 sudo[203298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohxvmgpeuxeoyeqclevnzdgvccrmpknx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914759.323678-1722-35197938181464/AnsiballZ_file.py'
Dec 05 06:05:59 compute-0 sudo[203298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:05:59 compute-0 python3.9[203300]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.d2vw2yyx recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:05:59 compute-0 sudo[203298]: pam_unix(sudo:session): session closed for user root
Dec 05 06:06:00 compute-0 sudo[203450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inbaqutxwafnpzviecobxwmraufefzbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914760.0960827-1746-228790815446499/AnsiballZ_stat.py'
Dec 05 06:06:00 compute-0 sudo[203450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:06:00 compute-0 python3.9[203452]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:06:00 compute-0 sudo[203450]: pam_unix(sudo:session): session closed for user root
Dec 05 06:06:00 compute-0 sudo[203528]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmkudjklzougsadydbsnyniasfvojvwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914760.0960827-1746-228790815446499/AnsiballZ_file.py'
Dec 05 06:06:00 compute-0 sudo[203528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:06:00 compute-0 python3.9[203530]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:06:00 compute-0 sudo[203528]: pam_unix(sudo:session): session closed for user root
Dec 05 06:06:01 compute-0 sudo[203680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-immnvapainjjuifyibxgktrqgdtbzvxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914760.9304502-1772-73114839075305/AnsiballZ_command.py'
Dec 05 06:06:01 compute-0 sudo[203680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:06:01 compute-0 python3.9[203682]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 06:06:01 compute-0 sudo[203680]: pam_unix(sudo:session): session closed for user root
Dec 05 06:06:01 compute-0 sudo[203833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bogzaiopapyslbxxlfauuvdredlxwkrb ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1764914761.3992076-1788-74067022882118/AnsiballZ_edpm_nftables_from_files.py'
Dec 05 06:06:01 compute-0 sudo[203833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:06:01 compute-0 python3[203835]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 05 06:06:01 compute-0 sudo[203833]: pam_unix(sudo:session): session closed for user root
Dec 05 06:06:02 compute-0 sudo[203985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qatpqzqkiutfkzncrpssljceyzfidauo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914761.9820964-1804-184188525879732/AnsiballZ_stat.py'
Dec 05 06:06:02 compute-0 sudo[203985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:06:02 compute-0 python3.9[203987]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:06:02 compute-0 sudo[203985]: pam_unix(sudo:session): session closed for user root
Dec 05 06:06:02 compute-0 sudo[204063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qadgvudbdfgicctjdazhnzdedcsofybo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914761.9820964-1804-184188525879732/AnsiballZ_file.py'
Dec 05 06:06:02 compute-0 sudo[204063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:06:02 compute-0 python3.9[204065]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:06:02 compute-0 sudo[204063]: pam_unix(sudo:session): session closed for user root
Dec 05 06:06:03 compute-0 sudo[204223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcjtiueggxazbjygazmsfacycyzbjvhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914762.8253636-1828-166614402807232/AnsiballZ_stat.py'
Dec 05 06:06:03 compute-0 sudo[204223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:06:03 compute-0 podman[204189]: 2025-12-05 06:06:03.045613195 +0000 UTC m=+0.040508060 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.schema-version=1.0)
Dec 05 06:06:03 compute-0 python3.9[204233]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:06:03 compute-0 sudo[204223]: pam_unix(sudo:session): session closed for user root
Dec 05 06:06:03 compute-0 sudo[204311]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmvzijuwxyxeluwoawukdmptubptywmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914762.8253636-1828-166614402807232/AnsiballZ_file.py'
Dec 05 06:06:03 compute-0 sudo[204311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:06:03 compute-0 podman[204313]: 2025-12-05 06:06:03.407387255 +0000 UTC m=+0.036344672 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 05 06:06:03 compute-0 python3.9[204314]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:06:03 compute-0 sudo[204311]: pam_unix(sudo:session): session closed for user root
Dec 05 06:06:03 compute-0 sudo[204491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ointjshpgjdhhtnlvydedyhqckevpwez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914763.6737263-1852-137896124963429/AnsiballZ_stat.py'
Dec 05 06:06:03 compute-0 sudo[204491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:06:03 compute-0 podman[204453]: 2025-12-05 06:06:03.890400293 +0000 UTC m=+0.043523940 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., release=1755695350, com.redhat.component=ubi9-minimal-container, distribution-scope=public, config_id=edpm, maintainer=Red Hat, Inc., version=9.6, architecture=x86_64, io.openshift.expose-services=, name=ubi9-minimal, io.openshift.tags=minimal rhel9)
Dec 05 06:06:04 compute-0 python3.9[204499]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:06:04 compute-0 sudo[204491]: pam_unix(sudo:session): session closed for user root
Dec 05 06:06:04 compute-0 sudo[204575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onnexkusespmgxgkoabkzbuwllysjkez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914763.6737263-1852-137896124963429/AnsiballZ_file.py'
Dec 05 06:06:04 compute-0 sudo[204575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:06:04 compute-0 python3.9[204577]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:06:04 compute-0 sudo[204575]: pam_unix(sudo:session): session closed for user root
Dec 05 06:06:04 compute-0 sudo[204727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbncsrjchihhrmlztosfjgeguyhrxmvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914764.5005395-1876-65943665414055/AnsiballZ_stat.py'
Dec 05 06:06:04 compute-0 sudo[204727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:06:04 compute-0 python3.9[204729]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:06:04 compute-0 sudo[204727]: pam_unix(sudo:session): session closed for user root
Dec 05 06:06:05 compute-0 sudo[204805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pszedfffnvmouxtngelkehnenbfnsmjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914764.5005395-1876-65943665414055/AnsiballZ_file.py'
Dec 05 06:06:05 compute-0 sudo[204805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:06:05 compute-0 python3.9[204807]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:06:05 compute-0 sudo[204805]: pam_unix(sudo:session): session closed for user root
Dec 05 06:06:05 compute-0 sudo[204957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-garuatrddmswfbwaymmhschscxywnjdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914765.3179064-1900-262608104471412/AnsiballZ_stat.py'
Dec 05 06:06:05 compute-0 sudo[204957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:06:05 compute-0 python3.9[204959]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 05 06:06:05 compute-0 sudo[204957]: pam_unix(sudo:session): session closed for user root
Dec 05 06:06:05 compute-0 sudo[205082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgaeoqdtzpkxdzxtcbnvjixrbokxcbhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914765.3179064-1900-262608104471412/AnsiballZ_copy.py'
Dec 05 06:06:05 compute-0 sudo[205082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:06:06 compute-0 python3.9[205084]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764914765.3179064-1900-262608104471412/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:06:06 compute-0 sudo[205082]: pam_unix(sudo:session): session closed for user root
Dec 05 06:06:06 compute-0 sudo[205234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzaxmsqzwgzusefnfhaciuoklgjovtnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914766.2626848-1930-65348975513175/AnsiballZ_file.py'
Dec 05 06:06:06 compute-0 sudo[205234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:06:06 compute-0 python3.9[205236]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:06:06 compute-0 sudo[205234]: pam_unix(sudo:session): session closed for user root
Dec 05 06:06:06 compute-0 sudo[205386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swbhnljuvqfyljveyzeuuqhyjqitapea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914766.7361066-1946-43613661273221/AnsiballZ_command.py'
Dec 05 06:06:06 compute-0 sudo[205386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:06:07 compute-0 python3.9[205388]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 06:06:07 compute-0 sudo[205386]: pam_unix(sudo:session): session closed for user root
Dec 05 06:06:07 compute-0 sudo[205541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lystoxhtccawfjrvgqfvfswesmylqyvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914767.232033-1962-152173065798391/AnsiballZ_blockinfile.py'
Dec 05 06:06:07 compute-0 sudo[205541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:06:07 compute-0 python3.9[205543]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:06:07 compute-0 sudo[205541]: pam_unix(sudo:session): session closed for user root
Dec 05 06:06:08 compute-0 sudo[205693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdtsmillskbmzpeltjqdckgjeztqdakh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914767.937612-1980-162336169756818/AnsiballZ_command.py'
Dec 05 06:06:08 compute-0 sudo[205693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:06:08 compute-0 python3.9[205695]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 06:06:08 compute-0 sudo[205693]: pam_unix(sudo:session): session closed for user root
Dec 05 06:06:08 compute-0 sudo[205846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otuegnfsjhdxocvmfuzaguhycyoecffy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914768.4243016-1996-84352293242759/AnsiballZ_stat.py'
Dec 05 06:06:08 compute-0 sudo[205846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:06:08 compute-0 python3.9[205848]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 05 06:06:08 compute-0 sudo[205846]: pam_unix(sudo:session): session closed for user root
Dec 05 06:06:09 compute-0 sudo[206000]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exvcxongymsnyqejzjdujnxttnrmqvmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914768.8990288-2012-134170147034434/AnsiballZ_command.py'
Dec 05 06:06:09 compute-0 sudo[206000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:06:09 compute-0 python3.9[206002]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 05 06:06:09 compute-0 sudo[206000]: pam_unix(sudo:session): session closed for user root
Dec 05 06:06:09 compute-0 sudo[206155]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffvveqrfvzqffqlsavgjydudnqztwndl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1764914769.4219093-2028-277500468630982/AnsiballZ_file.py'
Dec 05 06:06:09 compute-0 sudo[206155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:06:09 compute-0 python3.9[206157]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 05 06:06:09 compute-0 sudo[206155]: pam_unix(sudo:session): session closed for user root
Dec 05 06:06:10 compute-0 sshd-session[186675]: Connection closed by 192.168.122.30 port 55452
Dec 05 06:06:10 compute-0 sshd-session[186672]: pam_unix(sshd:session): session closed for user zuul
Dec 05 06:06:10 compute-0 systemd-logind[745]: Session 25 logged out. Waiting for processes to exit.
Dec 05 06:06:10 compute-0 systemd[1]: session-25.scope: Deactivated successfully.
Dec 05 06:06:10 compute-0 systemd[1]: session-25.scope: Consumed 54.561s CPU time.
Dec 05 06:06:10 compute-0 systemd-logind[745]: Removed session 25.
Dec 05 06:06:25 compute-0 podman[206182]: 2025-12-05 06:06:25.448537282 +0000 UTC m=+0.035206486 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 06:06:26 compute-0 podman[206204]: 2025-12-05 06:06:26.476817249 +0000 UTC m=+0.058742002 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec 05 06:06:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:06:29.486 104041 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:06:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:06:29.486 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:06:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:06:29.486 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:06:29 compute-0 podman[196599]: time="2025-12-05T06:06:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:06:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:06:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:06:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:06:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2553 "" "Go-http-client/1.1"
Dec 05 06:06:31 compute-0 openstack_network_exporter[198686]: ERROR   06:06:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:06:31 compute-0 openstack_network_exporter[198686]: ERROR   06:06:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:06:31 compute-0 openstack_network_exporter[198686]: ERROR   06:06:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:06:31 compute-0 openstack_network_exporter[198686]: ERROR   06:06:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:06:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:06:31 compute-0 openstack_network_exporter[198686]: ERROR   06:06:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:06:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:06:33 compute-0 podman[206229]: 2025-12-05 06:06:33.45630548 +0000 UTC m=+0.041949252 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Dec 05 06:06:33 compute-0 podman[206246]: 2025-12-05 06:06:33.508393122 +0000 UTC m=+0.037290030 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 05 06:06:34 compute-0 podman[206262]: 2025-12-05 06:06:34.474989704 +0000 UTC m=+0.061656014 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, build-date=2025-08-20T13:12:41, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, architecture=x86_64, vendor=Red Hat, Inc., release=1755695350, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses 
microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 05 06:06:37 compute-0 nova_compute[186329]: 2025-12-05 06:06:37.585 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:06:37 compute-0 nova_compute[186329]: 2025-12-05 06:06:37.585 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:06:38 compute-0 nova_compute[186329]: 2025-12-05 06:06:38.090 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:06:38 compute-0 nova_compute[186329]: 2025-12-05 06:06:38.090 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:06:38 compute-0 nova_compute[186329]: 2025-12-05 06:06:38.091 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:06:38 compute-0 nova_compute[186329]: 2025-12-05 06:06:38.091 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:06:38 compute-0 nova_compute[186329]: 2025-12-05 06:06:38.091 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:06:38 compute-0 nova_compute[186329]: 2025-12-05 06:06:38.091 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:06:38 compute-0 nova_compute[186329]: 2025-12-05 06:06:38.091 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 05 06:06:38 compute-0 nova_compute[186329]: 2025-12-05 06:06:38.091 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:06:38 compute-0 nova_compute[186329]: 2025-12-05 06:06:38.601 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:06:38 compute-0 nova_compute[186329]: 2025-12-05 06:06:38.601 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:06:38 compute-0 nova_compute[186329]: 2025-12-05 06:06:38.602 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:06:38 compute-0 nova_compute[186329]: 2025-12-05 06:06:38.602 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 05 06:06:38 compute-0 nova_compute[186329]: 2025-12-05 06:06:38.810 186333 WARNING nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:06:38 compute-0 nova_compute[186329]: 2025-12-05 06:06:38.810 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:06:38 compute-0 nova_compute[186329]: 2025-12-05 06:06:38.822 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.011s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:06:38 compute-0 nova_compute[186329]: 2025-12-05 06:06:38.822 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6210MB free_disk=73.20528411865234GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": 
"0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 05 06:06:38 compute-0 nova_compute[186329]: 2025-12-05 06:06:38.822 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:06:38 compute-0 nova_compute[186329]: 2025-12-05 06:06:38.823 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:06:39 compute-0 nova_compute[186329]: 2025-12-05 06:06:39.894 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 05 06:06:39 compute-0 nova_compute[186329]: 2025-12-05 06:06:39.895 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 06:06:38 up 44 min,  0 user,  load average: 0.50, 0.71, 0.52\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 05 06:06:39 compute-0 nova_compute[186329]: 2025-12-05 06:06:39.951 186333 DEBUG nova.compute.provider_tree [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:06:40 compute-0 nova_compute[186329]: 2025-12-05 06:06:40.455 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:06:40 compute-0 nova_compute[186329]: 2025-12-05 06:06:40.961 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 05 06:06:40 compute-0 nova_compute[186329]: 2025-12-05 06:06:40.962 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.139s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:06:56 compute-0 podman[206281]: 2025-12-05 06:06:56.455318508 +0000 UTC m=+0.036345375 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 05 06:06:57 compute-0 podman[206303]: 2025-12-05 06:06:57.488624955 +0000 UTC m=+0.075709970 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Dec 05 06:07:01 compute-0 openstack_network_exporter[198686]: ERROR   06:07:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:07:01 compute-0 openstack_network_exporter[198686]: ERROR   06:07:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:07:01 compute-0 openstack_network_exporter[198686]: ERROR   06:07:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:07:01 compute-0 openstack_network_exporter[198686]: ERROR   06:07:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:07:01 compute-0 openstack_network_exporter[198686]: ERROR   06:07:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:07:04 compute-0 podman[206326]: 2025-12-05 06:07:04.447578918 +0000 UTC m=+0.033875677 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 05 06:07:04 compute-0 podman[206327]: 2025-12-05 06:07:04.452369606 +0000 UTC m=+0.037226852 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 05 06:07:05 compute-0 podman[206359]: 2025-12-05 06:07:05.448369124 +0000 UTC m=+0.035731303 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_id=edpm, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, architecture=x86_64, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, name=ubi9-minimal, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 05 06:07:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:07:08.759 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '46:84:d1', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'a6:99:88:db:d9:a2'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:07:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:07:08.760 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 05 06:07:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:07:08.761 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=89d40815-76f5-4f1d-9077-84d831b7d6c4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:07:11 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:07:11.872 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a6:8c:44 192.168.122.171'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.171/24', 'neutron:device_id': 'ovnmeta-b0ae35a6-6c33-49fd-acf6-f76328368383', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b0ae35a6-6c33-49fd-acf6-f76328368383', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fcef582be2274b9ba43451b49b4066ec', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=be391817-a76f-4798-8001-b4cde62d24f5, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=ec990ed7-6bcd-4364-9a8f-04a72390861a) old=Port_Binding(mac=['fa:16:3e:a6:8c:44'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-b0ae35a6-6c33-49fd-acf6-f76328368383', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b0ae35a6-6c33-49fd-acf6-f76328368383', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fcef582be2274b9ba43451b49b4066ec', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:07:11 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:07:11.873 104041 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ec990ed7-6bcd-4364-9a8f-04a72390861a in datapath b0ae35a6-6c33-49fd-acf6-f76328368383 updated
Dec 05 06:07:11 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:07:11.874 104041 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b0ae35a6-6c33-49fd-acf6-f76328368383, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 05 06:07:11 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:07:11.874 104041 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmp3umvhly2/privsep.sock']
Dec 05 06:07:12 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:07:12.448 104041 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Dec 05 06:07:12 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:07:12.448 104041 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp3umvhly2/privsep.sock __init__ /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:377
Dec 05 06:07:12 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:07:12.361 206383 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 05 06:07:12 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:07:12.364 206383 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 05 06:07:12 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:07:12.365 206383 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Dec 05 06:07:12 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:07:12.366 206383 INFO oslo.privsep.daemon [-] privsep daemon running as pid 206383
Dec 05 06:07:12 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:07:12.450 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[844d2105-51f0-46ab-9ecf-95fa46346e9b]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:07:12 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:07:12.835 206383 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:07:12 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:07:12.835 206383 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:07:12 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:07:12.835 206383 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:07:13 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:07:13.202 206383 INFO oslo_service.backend [-] Loading backend: eventlet
Dec 05 06:07:13 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:07:13.206 206383 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Dec 05 06:07:13 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:07:13.239 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[0b90ed30-cf07-49c4-8357-33c06ce8401c]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:07:27 compute-0 podman[206388]: 2025-12-05 06:07:27.457125068 +0000 UTC m=+0.040144643 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 06:07:28 compute-0 podman[206410]: 2025-12-05 06:07:28.470007131 +0000 UTC m=+0.056939178 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 05 06:07:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:07:29.487 104041 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:07:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:07:29.487 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:07:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:07:29.487 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:07:29 compute-0 podman[196599]: time="2025-12-05T06:07:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:07:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:07:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:07:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:07:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2557 "" "Go-http-client/1.1"
Dec 05 06:07:31 compute-0 openstack_network_exporter[198686]: ERROR   06:07:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:07:31 compute-0 openstack_network_exporter[198686]: ERROR   06:07:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:07:31 compute-0 openstack_network_exporter[198686]: ERROR   06:07:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:07:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:07:31 compute-0 openstack_network_exporter[198686]: ERROR   06:07:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:07:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:07:31 compute-0 openstack_network_exporter[198686]: ERROR   06:07:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:07:35 compute-0 podman[206434]: 2025-12-05 06:07:35.449272829 +0000 UTC m=+0.035639690 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Dec 05 06:07:35 compute-0 podman[206435]: 2025-12-05 06:07:35.450334291 +0000 UTC m=+0.035029906 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.build-date=20251202, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 05 06:07:35 compute-0 podman[206468]: 2025-12-05 06:07:35.509372746 +0000 UTC m=+0.036888272 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, config_id=edpm, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, release=1755695350, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible)
Dec 05 06:07:40 compute-0 nova_compute[186329]: 2025-12-05 06:07:40.963 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:07:40 compute-0 nova_compute[186329]: 2025-12-05 06:07:40.963 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:07:40 compute-0 nova_compute[186329]: 2025-12-05 06:07:40.963 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:07:40 compute-0 nova_compute[186329]: 2025-12-05 06:07:40.964 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:07:40 compute-0 nova_compute[186329]: 2025-12-05 06:07:40.964 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:07:40 compute-0 nova_compute[186329]: 2025-12-05 06:07:40.964 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:07:40 compute-0 nova_compute[186329]: 2025-12-05 06:07:40.964 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:07:40 compute-0 nova_compute[186329]: 2025-12-05 06:07:40.964 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 05 06:07:40 compute-0 nova_compute[186329]: 2025-12-05 06:07:40.964 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:07:41 compute-0 nova_compute[186329]: 2025-12-05 06:07:41.471 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:07:41 compute-0 nova_compute[186329]: 2025-12-05 06:07:41.472 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:07:41 compute-0 nova_compute[186329]: 2025-12-05 06:07:41.472 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:07:41 compute-0 nova_compute[186329]: 2025-12-05 06:07:41.472 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 05 06:07:41 compute-0 nova_compute[186329]: 2025-12-05 06:07:41.640 186333 WARNING nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:07:41 compute-0 nova_compute[186329]: 2025-12-05 06:07:41.641 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:07:41 compute-0 nova_compute[186329]: 2025-12-05 06:07:41.651 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.010s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:07:41 compute-0 nova_compute[186329]: 2025-12-05 06:07:41.651 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6121MB free_disk=73.20505905151367GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 05 06:07:41 compute-0 nova_compute[186329]: 2025-12-05 06:07:41.651 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:07:41 compute-0 nova_compute[186329]: 2025-12-05 06:07:41.652 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:07:42 compute-0 nova_compute[186329]: 2025-12-05 06:07:42.681 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 05 06:07:42 compute-0 nova_compute[186329]: 2025-12-05 06:07:42.682 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 06:07:41 up 45 min,  0 user,  load average: 0.23, 0.59, 0.49\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 05 06:07:42 compute-0 nova_compute[186329]: 2025-12-05 06:07:42.703 186333 DEBUG nova.compute.provider_tree [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:07:43 compute-0 nova_compute[186329]: 2025-12-05 06:07:43.209 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:07:43 compute-0 nova_compute[186329]: 2025-12-05 06:07:43.715 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 05 06:07:43 compute-0 nova_compute[186329]: 2025-12-05 06:07:43.715 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.063s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:07:58 compute-0 podman[206487]: 2025-12-05 06:07:58.445299075 +0000 UTC m=+0.032664566 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 06:07:59 compute-0 podman[206507]: 2025-12-05 06:07:59.463348073 +0000 UTC m=+0.050721320 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Dec 05 06:07:59 compute-0 podman[196599]: time="2025-12-05T06:07:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:07:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:07:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:07:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:07:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2560 "" "Go-http-client/1.1"
Dec 05 06:08:01 compute-0 openstack_network_exporter[198686]: ERROR   06:08:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:08:01 compute-0 openstack_network_exporter[198686]: ERROR   06:08:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:08:01 compute-0 openstack_network_exporter[198686]: ERROR   06:08:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:08:01 compute-0 openstack_network_exporter[198686]: ERROR   06:08:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:08:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:08:01 compute-0 openstack_network_exporter[198686]: ERROR   06:08:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:08:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:08:06 compute-0 podman[206531]: 2025-12-05 06:08:06.454352769 +0000 UTC m=+0.038922128 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, architecture=x86_64, distribution-scope=public, managed_by=edpm_ansible, vcs-type=git, version=9.6, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 05 06:08:06 compute-0 podman[206530]: 2025-12-05 06:08:06.45939458 +0000 UTC m=+0.045707451 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202, 
org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true)
Dec 05 06:08:06 compute-0 podman[206532]: 2025-12-05 06:08:06.460547984 +0000 UTC m=+0.043476515 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 05 06:08:29 compute-0 podman[206584]: 2025-12-05 06:08:29.449346249 +0000 UTC m=+0.035386854 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 05 06:08:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:08:29.488 104041 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:08:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:08:29.488 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:08:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:08:29.488 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:08:29 compute-0 podman[196599]: time="2025-12-05T06:08:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:08:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:08:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:08:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:08:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2561 "" "Go-http-client/1.1"
Dec 05 06:08:30 compute-0 podman[206606]: 2025-12-05 06:08:30.460905731 +0000 UTC m=+0.048235195 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_id=ovn_controller, container_name=ovn_controller)
Dec 05 06:08:31 compute-0 openstack_network_exporter[198686]: ERROR   06:08:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:08:31 compute-0 openstack_network_exporter[198686]: ERROR   06:08:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:08:31 compute-0 openstack_network_exporter[198686]: ERROR   06:08:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:08:31 compute-0 openstack_network_exporter[198686]: ERROR   06:08:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:08:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:08:31 compute-0 openstack_network_exporter[198686]: ERROR   06:08:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:08:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:08:37 compute-0 podman[206630]: 2025-12-05 06:08:37.457244756 +0000 UTC m=+0.042185471 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 05 06:08:37 compute-0 nova_compute[186329]: 2025-12-05 06:08:37.457 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:08:37 compute-0 podman[206631]: 2025-12-05 06:08:37.458948251 +0000 UTC m=+0.042135237 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec)
Dec 05 06:08:37 compute-0 podman[206629]: 2025-12-05 06:08:37.461342996 +0000 UTC m=+0.047521384 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_id=ovn_metadata_agent)
Dec 05 06:08:37 compute-0 nova_compute[186329]: 2025-12-05 06:08:37.964 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:08:37 compute-0 nova_compute[186329]: 2025-12-05 06:08:37.964 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:08:37 compute-0 nova_compute[186329]: 2025-12-05 06:08:37.965 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:08:37 compute-0 nova_compute[186329]: 2025-12-05 06:08:37.965 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:08:37 compute-0 nova_compute[186329]: 2025-12-05 06:08:37.965 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 05 06:08:37 compute-0 nova_compute[186329]: 2025-12-05 06:08:37.965 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:08:38 compute-0 nova_compute[186329]: 2025-12-05 06:08:38.475 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:08:38 compute-0 nova_compute[186329]: 2025-12-05 06:08:38.475 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:08:38 compute-0 nova_compute[186329]: 2025-12-05 06:08:38.475 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:08:38 compute-0 nova_compute[186329]: 2025-12-05 06:08:38.476 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 05 06:08:38 compute-0 nova_compute[186329]: 2025-12-05 06:08:38.640 186333 WARNING nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:08:38 compute-0 nova_compute[186329]: 2025-12-05 06:08:38.641 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:08:38 compute-0 nova_compute[186329]: 2025-12-05 06:08:38.650 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.010s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:08:38 compute-0 nova_compute[186329]: 2025-12-05 06:08:38.651 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6135MB free_disk=73.20509338378906GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": 
"0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 05 06:08:38 compute-0 nova_compute[186329]: 2025-12-05 06:08:38.651 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:08:38 compute-0 nova_compute[186329]: 2025-12-05 06:08:38.651 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:08:39 compute-0 nova_compute[186329]: 2025-12-05 06:08:39.683 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 05 06:08:39 compute-0 nova_compute[186329]: 2025-12-05 06:08:39.684 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 06:08:38 up 46 min,  0 user,  load average: 0.08, 0.48, 0.46\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 05 06:08:39 compute-0 nova_compute[186329]: 2025-12-05 06:08:39.700 186333 DEBUG nova.compute.provider_tree [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:08:40 compute-0 nova_compute[186329]: 2025-12-05 06:08:40.205 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:08:40 compute-0 nova_compute[186329]: 2025-12-05 06:08:40.710 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 05 06:08:40 compute-0 nova_compute[186329]: 2025-12-05 06:08:40.711 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.059s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:08:40 compute-0 nova_compute[186329]: 2025-12-05 06:08:40.958 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:08:40 compute-0 nova_compute[186329]: 2025-12-05 06:08:40.958 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:08:40 compute-0 nova_compute[186329]: 2025-12-05 06:08:40.959 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:08:59 compute-0 podman[196599]: time="2025-12-05T06:08:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:08:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:08:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:08:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:08:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2559 "" "Go-http-client/1.1"
Dec 05 06:09:00 compute-0 podman[206680]: 2025-12-05 06:09:00.47130923 +0000 UTC m=+0.058726412 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 06:09:00 compute-0 podman[206702]: 2025-12-05 06:09:00.543559297 +0000 UTC m=+0.054128253 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, io.buildah.version=1.41.4)
Dec 05 06:09:01 compute-0 openstack_network_exporter[198686]: ERROR   06:09:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:09:01 compute-0 openstack_network_exporter[198686]: ERROR   06:09:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:09:01 compute-0 openstack_network_exporter[198686]: ERROR   06:09:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:09:01 compute-0 openstack_network_exporter[198686]: ERROR   06:09:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:09:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:09:01 compute-0 openstack_network_exporter[198686]: ERROR   06:09:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:09:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:09:08 compute-0 podman[206727]: 2025-12-05 06:09:08.458465813 +0000 UTC m=+0.041945229 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41)
Dec 05 06:09:08 compute-0 podman[206726]: 2025-12-05 06:09:08.45851171 +0000 UTC m=+0.044042515 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Dec 05 06:09:08 compute-0 podman[206728]: 2025-12-05 06:09:08.464450864 +0000 UTC m=+0.045805633 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 05 06:09:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:09:29.488 104041 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:09:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:09:29.489 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:09:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:09:29.489 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:09:29 compute-0 podman[196599]: time="2025-12-05T06:09:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:09:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:09:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:09:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:09:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2565 "" "Go-http-client/1.1"
Dec 05 06:09:31 compute-0 openstack_network_exporter[198686]: ERROR   06:09:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:09:31 compute-0 openstack_network_exporter[198686]: ERROR   06:09:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:09:31 compute-0 openstack_network_exporter[198686]: ERROR   06:09:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:09:31 compute-0 openstack_network_exporter[198686]: ERROR   06:09:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:09:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:09:31 compute-0 openstack_network_exporter[198686]: ERROR   06:09:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:09:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:09:31 compute-0 podman[206781]: 2025-12-05 06:09:31.464439917 +0000 UTC m=+0.039461594 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 05 06:09:31 compute-0 podman[206780]: 2025-12-05 06:09:31.479426441 +0000 UTC m=+0.061225931 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true)
Dec 05 06:09:34 compute-0 nova_compute[186329]: 2025-12-05 06:09:34.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:09:34 compute-0 nova_compute[186329]: 2025-12-05 06:09:34.709 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Dec 05 06:09:35 compute-0 nova_compute[186329]: 2025-12-05 06:09:35.214 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Dec 05 06:09:35 compute-0 nova_compute[186329]: 2025-12-05 06:09:35.215 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:09:35 compute-0 nova_compute[186329]: 2025-12-05 06:09:35.215 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Dec 05 06:09:35 compute-0 nova_compute[186329]: 2025-12-05 06:09:35.718 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:09:37 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:09:37.480 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '46:84:d1', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'a6:99:88:db:d9:a2'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:09:37 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:09:37.481 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 05 06:09:37 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:09:37.482 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=89d40815-76f5-4f1d-9077-84d831b7d6c4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:09:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:09:38.092 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:18:4a:2c 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-e6bd0d8a-7766-43e8-93f1-3ac74a19945f', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e6bd0d8a-7766-43e8-93f1-3ac74a19945f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d7b7b597d1f4b498f969bf9d1f11007', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=23a782e6-eaff-4d06-b579-baf357abc95e, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=c10ef85a-1eab-4a1b-afeb-00834d5745e7) old=Port_Binding(mac=['fa:16:3e:18:4a:2c'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-e6bd0d8a-7766-43e8-93f1-3ac74a19945f', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e6bd0d8a-7766-43e8-93f1-3ac74a19945f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d7b7b597d1f4b498f969bf9d1f11007', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:09:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:09:38.092 104041 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port c10ef85a-1eab-4a1b-afeb-00834d5745e7 in datapath e6bd0d8a-7766-43e8-93f1-3ac74a19945f updated
Dec 05 06:09:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:09:38.093 104041 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e6bd0d8a-7766-43e8-93f1-3ac74a19945f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 05 06:09:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:09:38.093 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[a268ba70-9b1a-4ffc-ab7b-c99609ab655d]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:09:39 compute-0 podman[206825]: 2025-12-05 06:09:39.445484124 +0000 UTC m=+0.031639289 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4)
Dec 05 06:09:39 compute-0 podman[206827]: 2025-12-05 06:09:39.458340858 +0000 UTC m=+0.040707124 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible)
Dec 05 06:09:39 compute-0 podman[206826]: 2025-12-05 06:09:39.465384019 +0000 UTC m=+0.050007475 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, name=ubi9-minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.buildah.version=1.33.7)
Dec 05 06:09:40 compute-0 nova_compute[186329]: 2025-12-05 06:09:40.222 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:09:40 compute-0 nova_compute[186329]: 2025-12-05 06:09:40.222 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:09:40 compute-0 nova_compute[186329]: 2025-12-05 06:09:40.222 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:09:40 compute-0 nova_compute[186329]: 2025-12-05 06:09:40.222 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:09:40 compute-0 nova_compute[186329]: 2025-12-05 06:09:40.223 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:09:40 compute-0 nova_compute[186329]: 2025-12-05 06:09:40.223 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 05 06:09:40 compute-0 nova_compute[186329]: 2025-12-05 06:09:40.223 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:09:40 compute-0 nova_compute[186329]: 2025-12-05 06:09:40.733 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:09:40 compute-0 nova_compute[186329]: 2025-12-05 06:09:40.733 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:09:40 compute-0 nova_compute[186329]: 2025-12-05 06:09:40.733 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:09:40 compute-0 nova_compute[186329]: 2025-12-05 06:09:40.733 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 05 06:09:40 compute-0 nova_compute[186329]: 2025-12-05 06:09:40.905 186333 WARNING nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:09:40 compute-0 nova_compute[186329]: 2025-12-05 06:09:40.905 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:09:40 compute-0 nova_compute[186329]: 2025-12-05 06:09:40.915 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.010s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:09:40 compute-0 nova_compute[186329]: 2025-12-05 06:09:40.915 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6113MB free_disk=73.20509338378906GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": 
"0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 05 06:09:40 compute-0 nova_compute[186329]: 2025-12-05 06:09:40.916 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:09:40 compute-0 nova_compute[186329]: 2025-12-05 06:09:40.916 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:09:41 compute-0 nova_compute[186329]: 2025-12-05 06:09:41.955 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 05 06:09:41 compute-0 nova_compute[186329]: 2025-12-05 06:09:41.956 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 06:09:40 up 47 min,  0 user,  load average: 0.03, 0.39, 0.43\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 05 06:09:41 compute-0 nova_compute[186329]: 2025-12-05 06:09:41.970 186333 DEBUG nova.compute.provider_tree [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:09:42 compute-0 nova_compute[186329]: 2025-12-05 06:09:42.475 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:09:42 compute-0 nova_compute[186329]: 2025-12-05 06:09:42.981 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 05 06:09:42 compute-0 nova_compute[186329]: 2025-12-05 06:09:42.982 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.066s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:09:43 compute-0 nova_compute[186329]: 2025-12-05 06:09:43.469 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:09:43 compute-0 nova_compute[186329]: 2025-12-05 06:09:43.470 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:09:44 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:09:44.823 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8d:c0:18 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-718eaaf5-3a75-4d4b-883f-c23c3cb4ff6d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-718eaaf5-3a75-4d4b-883f-c23c3cb4ff6d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '26b9487f4f4942b9b3efdb076d8acad1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c9be86ee-be7c-4e7b-9ef6-2bb4e900c57d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=9cbd763b-8dc1-4257-8503-1c18d1421abb) old=Port_Binding(mac=['fa:16:3e:8d:c0:18'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-718eaaf5-3a75-4d4b-883f-c23c3cb4ff6d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-718eaaf5-3a75-4d4b-883f-c23c3cb4ff6d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '26b9487f4f4942b9b3efdb076d8acad1', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:09:44 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:09:44.824 104041 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 9cbd763b-8dc1-4257-8503-1c18d1421abb in datapath 718eaaf5-3a75-4d4b-883f-c23c3cb4ff6d updated
Dec 05 06:09:44 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:09:44.824 104041 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 718eaaf5-3a75-4d4b-883f-c23c3cb4ff6d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 05 06:09:44 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:09:44.825 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[b0a578d1-0a96-4b74-92da-b7a69393078e]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:09:59 compute-0 podman[196599]: time="2025-12-05T06:09:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:09:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:09:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:09:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:09:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2563 "" "Go-http-client/1.1"
Dec 05 06:10:01 compute-0 openstack_network_exporter[198686]: ERROR   06:10:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:10:01 compute-0 openstack_network_exporter[198686]: ERROR   06:10:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:10:01 compute-0 openstack_network_exporter[198686]: ERROR   06:10:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:10:01 compute-0 openstack_network_exporter[198686]: ERROR   06:10:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:10:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:10:01 compute-0 openstack_network_exporter[198686]: ERROR   06:10:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:10:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:10:01 compute-0 anacron[124195]: Job `cron.daily' started
Dec 05 06:10:01 compute-0 anacron[124195]: Job `cron.daily' terminated
Dec 05 06:10:02 compute-0 podman[206883]: 2025-12-05 06:10:02.457472538 +0000 UTC m=+0.041451744 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 05 06:10:02 compute-0 podman[206882]: 2025-12-05 06:10:02.477635847 +0000 UTC m=+0.064297401 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 05 06:10:10 compute-0 podman[206928]: 2025-12-05 06:10:10.467508709 +0000 UTC m=+0.045538972 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, maintainer=OpenStack Kubernetes Operator team)
Dec 05 06:10:10 compute-0 podman[206927]: 2025-12-05 06:10:10.476386636 +0000 UTC m=+0.056064795 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, name=ubi9-minimal, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-type=git, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, 
io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, maintainer=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 05 06:10:10 compute-0 podman[206926]: 2025-12-05 06:10:10.485504375 +0000 UTC m=+0.068256016 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Dec 05 06:10:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:10:29.489 104041 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:10:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:10:29.489 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:10:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:10:29.489 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:10:29 compute-0 podman[196599]: time="2025-12-05T06:10:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:10:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:10:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:10:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:10:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2565 "" "Go-http-client/1.1"
Dec 05 06:10:31 compute-0 openstack_network_exporter[198686]: ERROR   06:10:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:10:31 compute-0 openstack_network_exporter[198686]: ERROR   06:10:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:10:31 compute-0 openstack_network_exporter[198686]: ERROR   06:10:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:10:31 compute-0 openstack_network_exporter[198686]: ERROR   06:10:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:10:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:10:31 compute-0 openstack_network_exporter[198686]: ERROR   06:10:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:10:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:10:33 compute-0 podman[206978]: 2025-12-05 06:10:33.469552432 +0000 UTC m=+0.055048020 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 05 06:10:33 compute-0 podman[206979]: 2025-12-05 06:10:33.479487806 +0000 UTC m=+0.063067906 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 06:10:36 compute-0 nova_compute[186329]: 2025-12-05 06:10:36.705 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:10:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:10:38.596 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '46:84:d1', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'a6:99:88:db:d9:a2'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:10:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:10:38.596 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 05 06:10:39 compute-0 nova_compute[186329]: 2025-12-05 06:10:39.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:10:39 compute-0 nova_compute[186329]: 2025-12-05 06:10:39.711 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:10:40 compute-0 nova_compute[186329]: 2025-12-05 06:10:40.223 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:10:40 compute-0 nova_compute[186329]: 2025-12-05 06:10:40.223 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:10:40 compute-0 nova_compute[186329]: 2025-12-05 06:10:40.223 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:10:40 compute-0 nova_compute[186329]: 2025-12-05 06:10:40.224 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 05 06:10:40 compute-0 nova_compute[186329]: 2025-12-05 06:10:40.392 186333 WARNING nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:10:40 compute-0 nova_compute[186329]: 2025-12-05 06:10:40.393 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:10:40 compute-0 nova_compute[186329]: 2025-12-05 06:10:40.403 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.010s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:10:40 compute-0 nova_compute[186329]: 2025-12-05 06:10:40.403 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6115MB free_disk=73.20486068725586GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": 
"0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 05 06:10:40 compute-0 nova_compute[186329]: 2025-12-05 06:10:40.404 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:10:40 compute-0 nova_compute[186329]: 2025-12-05 06:10:40.404 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:10:41 compute-0 podman[207026]: 2025-12-05 06:10:41.461350287 +0000 UTC m=+0.043488206 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=multipathd)
Dec 05 06:10:41 compute-0 podman[207024]: 2025-12-05 06:10:41.461362901 +0000 UTC m=+0.047161705 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Dec 05 06:10:41 compute-0 podman[207025]: 2025-12-05 06:10:41.461366126 +0000 UTC m=+0.044662061 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, vcs-type=git, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, version=9.6, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc.)
Dec 05 06:10:41 compute-0 nova_compute[186329]: 2025-12-05 06:10:41.460 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 05 06:10:41 compute-0 nova_compute[186329]: 2025-12-05 06:10:41.461 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 06:10:40 up 48 min,  0 user,  load average: 0.01, 0.32, 0.40\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 05 06:10:41 compute-0 nova_compute[186329]: 2025-12-05 06:10:41.494 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Refreshing inventories for resource provider f2df025e-56e9-4920-9fad-1a12202c4aeb _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Dec 05 06:10:41 compute-0 nova_compute[186329]: 2025-12-05 06:10:41.525 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Updating ProviderTree inventory for provider f2df025e-56e9-4920-9fad-1a12202c4aeb from _refresh_and_get_inventory using data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Dec 05 06:10:41 compute-0 nova_compute[186329]: 2025-12-05 06:10:41.526 186333 DEBUG nova.compute.provider_tree [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Updating inventory in ProviderTree for provider f2df025e-56e9-4920-9fad-1a12202c4aeb with inventory: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 05 06:10:41 compute-0 nova_compute[186329]: 2025-12-05 06:10:41.534 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Refreshing aggregate associations for resource provider f2df025e-56e9-4920-9fad-1a12202c4aeb, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Dec 05 06:10:41 compute-0 nova_compute[186329]: 2025-12-05 06:10:41.546 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Refreshing trait associations for resource provider f2df025e-56e9-4920-9fad-1a12202c4aeb, traits: COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_EXTEND,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SOUND_MODEL_SB16,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE42,COMPUTE_NET_VIRTIO_PACKED,HW_CPU_X86_SSE2,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_TPM_CRB,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_TPM_TIS,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SOUND_MODEL_ES1370,HW_ARCH_X86_64,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SOUND_MODEL_USB,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_TRUSTED_CERTS,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_ARCH_X86_64,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_INTEL _refresh_associations 
/usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Dec 05 06:10:41 compute-0 nova_compute[186329]: 2025-12-05 06:10:41.559 186333 DEBUG nova.compute.provider_tree [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:10:42 compute-0 nova_compute[186329]: 2025-12-05 06:10:42.063 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:10:42 compute-0 nova_compute[186329]: 2025-12-05 06:10:42.569 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 05 06:10:42 compute-0 nova_compute[186329]: 2025-12-05 06:10:42.569 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.165s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:10:43 compute-0 nova_compute[186329]: 2025-12-05 06:10:43.568 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:10:43 compute-0 nova_compute[186329]: 2025-12-05 06:10:43.568 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:10:43 compute-0 nova_compute[186329]: 2025-12-05 06:10:43.569 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:10:43 compute-0 nova_compute[186329]: 2025-12-05 06:10:43.569 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:10:43 compute-0 nova_compute[186329]: 2025-12-05 06:10:43.569 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:10:43 compute-0 nova_compute[186329]: 2025-12-05 06:10:43.569 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:10:43 compute-0 nova_compute[186329]: 2025-12-05 06:10:43.569 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 05 06:10:44 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:10:44.597 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=89d40815-76f5-4f1d-9077-84d831b7d6c4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:10:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:10:55.546 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:88:7c:f8 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-dfccf5a2-ddd2-463b-b875-016786dc54e3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dfccf5a2-ddd2-463b-b875-016786dc54e3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ddd843275bc4c4c9124163a55e82b69', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3b4dbc03-2332-453d-bad7-decedc4781a2, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=0987f272-b86c-4b2e-8cfe-0c571c93fa52) old=Port_Binding(mac=['fa:16:3e:88:7c:f8'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-dfccf5a2-ddd2-463b-b875-016786dc54e3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dfccf5a2-ddd2-463b-b875-016786dc54e3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ddd843275bc4c4c9124163a55e82b69', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:10:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:10:55.547 104041 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 0987f272-b86c-4b2e-8cfe-0c571c93fa52 in datapath dfccf5a2-ddd2-463b-b875-016786dc54e3 updated
Dec 05 06:10:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:10:55.548 104041 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dfccf5a2-ddd2-463b-b875-016786dc54e3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 05 06:10:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:10:55.550 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[a5a7e839-4118-4938-8a69-63eb08aa86d7]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:10:59 compute-0 podman[196599]: time="2025-12-05T06:10:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:10:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:10:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:10:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:10:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2564 "" "Go-http-client/1.1"
Dec 05 06:11:01 compute-0 openstack_network_exporter[198686]: ERROR   06:11:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:11:01 compute-0 openstack_network_exporter[198686]: ERROR   06:11:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:11:01 compute-0 openstack_network_exporter[198686]: ERROR   06:11:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:11:01 compute-0 openstack_network_exporter[198686]: ERROR   06:11:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:11:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:11:01 compute-0 openstack_network_exporter[198686]: ERROR   06:11:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:11:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:11:04 compute-0 podman[207077]: 2025-12-05 06:11:04.454458644 +0000 UTC m=+0.032315205 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 06:11:04 compute-0 podman[207076]: 2025-12-05 06:11:04.469809471 +0000 UTC m=+0.049073896 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251202)
Dec 05 06:11:05 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:05.326 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b5:85:55 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-1d7316d1-c82b-49da-8906-539c7c2f1fac', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1d7316d1-c82b-49da-8906-539c7c2f1fac', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7eb763694d8e480e9bb11451a932988d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b1dc3b7a-beb0-43bd-a264-6f4f72f60ce3, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=c9d3bea8-a07d-4f5e-a2aa-666380a56a1d) old=Port_Binding(mac=['fa:16:3e:b5:85:55'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-1d7316d1-c82b-49da-8906-539c7c2f1fac', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1d7316d1-c82b-49da-8906-539c7c2f1fac', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7eb763694d8e480e9bb11451a932988d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:11:05 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:05.326 104041 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port c9d3bea8-a07d-4f5e-a2aa-666380a56a1d in datapath 1d7316d1-c82b-49da-8906-539c7c2f1fac updated
Dec 05 06:11:05 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:05.327 104041 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1d7316d1-c82b-49da-8906-539c7c2f1fac, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 05 06:11:05 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:05.327 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[eedb011c-7d52-423b-b994-4ba8071f1ebd]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:11:12 compute-0 podman[207121]: 2025-12-05 06:11:12.464379623 +0000 UTC m=+0.048313870 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, vcs-type=git, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, name=ubi9-minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vendor=Red Hat, Inc.)
Dec 05 06:11:12 compute-0 podman[207120]: 2025-12-05 06:11:12.486367078 +0000 UTC m=+0.071926147 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, 
tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Dec 05 06:11:12 compute-0 podman[207122]: 2025-12-05 06:11:12.486440635 +0000 UTC m=+0.068432093 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 05 06:11:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:29.490 104041 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:11:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:29.491 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:11:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:29.491 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:11:29 compute-0 podman[196599]: time="2025-12-05T06:11:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:11:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:11:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:11:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:11:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2564 "" "Go-http-client/1.1"
Dec 05 06:11:31 compute-0 openstack_network_exporter[198686]: ERROR   06:11:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:11:31 compute-0 openstack_network_exporter[198686]: ERROR   06:11:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:11:31 compute-0 openstack_network_exporter[198686]: ERROR   06:11:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:11:31 compute-0 openstack_network_exporter[198686]: ERROR   06:11:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:11:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:11:31 compute-0 openstack_network_exporter[198686]: ERROR   06:11:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:11:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:11:35 compute-0 podman[207175]: 2025-12-05 06:11:35.45641239 +0000 UTC m=+0.040796113 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 06:11:35 compute-0 podman[207174]: 2025-12-05 06:11:35.474555984 +0000 UTC m=+0.060551159 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 05 06:11:35 compute-0 nova_compute[186329]: 2025-12-05 06:11:35.530 186333 DEBUG oslo_concurrency.lockutils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Acquiring lock "94466816-dfc5-4455-9992-20e7b47fddc3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:11:35 compute-0 nova_compute[186329]: 2025-12-05 06:11:35.531 186333 DEBUG oslo_concurrency.lockutils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Lock "94466816-dfc5-4455-9992-20e7b47fddc3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:11:36 compute-0 nova_compute[186329]: 2025-12-05 06:11:36.034 186333 DEBUG nova.compute.manager [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Dec 05 06:11:36 compute-0 nova_compute[186329]: 2025-12-05 06:11:36.599 186333 DEBUG oslo_concurrency.lockutils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:11:36 compute-0 nova_compute[186329]: 2025-12-05 06:11:36.599 186333 DEBUG oslo_concurrency.lockutils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:11:36 compute-0 nova_compute[186329]: 2025-12-05 06:11:36.603 186333 DEBUG nova.virt.hardware [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 05 06:11:36 compute-0 nova_compute[186329]: 2025-12-05 06:11:36.603 186333 INFO nova.compute.claims [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Claim successful on node compute-0.ctlplane.example.com
Dec 05 06:11:37 compute-0 nova_compute[186329]: 2025-12-05 06:11:37.638 186333 DEBUG nova.compute.provider_tree [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:11:38 compute-0 nova_compute[186329]: 2025-12-05 06:11:38.142 186333 DEBUG nova.scheduler.client.report [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:11:38 compute-0 nova_compute[186329]: 2025-12-05 06:11:38.648 186333 DEBUG oslo_concurrency.lockutils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.048s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:11:38 compute-0 nova_compute[186329]: 2025-12-05 06:11:38.648 186333 DEBUG nova.compute.manager [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Dec 05 06:11:39 compute-0 nova_compute[186329]: 2025-12-05 06:11:39.155 186333 DEBUG nova.compute.manager [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Dec 05 06:11:39 compute-0 nova_compute[186329]: 2025-12-05 06:11:39.155 186333 DEBUG nova.network.neutron [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Dec 05 06:11:39 compute-0 nova_compute[186329]: 2025-12-05 06:11:39.156 186333 WARNING neutronclient.v2_0.client [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:11:39 compute-0 nova_compute[186329]: 2025-12-05 06:11:39.157 186333 WARNING neutronclient.v2_0.client [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:11:39 compute-0 nova_compute[186329]: 2025-12-05 06:11:39.662 186333 INFO nova.virt.libvirt.driver [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 06:11:39 compute-0 nova_compute[186329]: 2025-12-05 06:11:39.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:11:39 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:39.994 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '46:84:d1', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'a6:99:88:db:d9:a2'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:11:39 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:39.995 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 05 06:11:40 compute-0 nova_compute[186329]: 2025-12-05 06:11:40.124 186333 DEBUG nova.network.neutron [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Successfully created port: baa5cca1-f04c-460d-b613-48c1aef4ec5b _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Dec 05 06:11:40 compute-0 nova_compute[186329]: 2025-12-05 06:11:40.167 186333 DEBUG nova.compute.manager [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Dec 05 06:11:40 compute-0 nova_compute[186329]: 2025-12-05 06:11:40.215 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:11:40 compute-0 nova_compute[186329]: 2025-12-05 06:11:40.215 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:11:40 compute-0 nova_compute[186329]: 2025-12-05 06:11:40.215 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:11:40 compute-0 nova_compute[186329]: 2025-12-05 06:11:40.216 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 05 06:11:40 compute-0 nova_compute[186329]: 2025-12-05 06:11:40.381 186333 WARNING nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:11:40 compute-0 nova_compute[186329]: 2025-12-05 06:11:40.381 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:11:40 compute-0 nova_compute[186329]: 2025-12-05 06:11:40.391 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.010s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:11:40 compute-0 nova_compute[186329]: 2025-12-05 06:11:40.392 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6120MB free_disk=73.20486068725586GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": 
"0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 05 06:11:40 compute-0 nova_compute[186329]: 2025-12-05 06:11:40.392 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:11:40 compute-0 nova_compute[186329]: 2025-12-05 06:11:40.392 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:11:40 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:40.996 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=89d40815-76f5-4f1d-9077-84d831b7d6c4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:11:41 compute-0 nova_compute[186329]: 2025-12-05 06:11:41.177 186333 DEBUG nova.compute.manager [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Dec 05 06:11:41 compute-0 nova_compute[186329]: 2025-12-05 06:11:41.179 186333 DEBUG nova.virt.libvirt.driver [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Dec 05 06:11:41 compute-0 nova_compute[186329]: 2025-12-05 06:11:41.179 186333 INFO nova.virt.libvirt.driver [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Creating image(s)
Dec 05 06:11:41 compute-0 nova_compute[186329]: 2025-12-05 06:11:41.180 186333 DEBUG oslo_concurrency.lockutils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Acquiring lock "/var/lib/nova/instances/94466816-dfc5-4455-9992-20e7b47fddc3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:11:41 compute-0 nova_compute[186329]: 2025-12-05 06:11:41.180 186333 DEBUG oslo_concurrency.lockutils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Lock "/var/lib/nova/instances/94466816-dfc5-4455-9992-20e7b47fddc3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:11:41 compute-0 nova_compute[186329]: 2025-12-05 06:11:41.180 186333 DEBUG oslo_concurrency.lockutils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Lock "/var/lib/nova/instances/94466816-dfc5-4455-9992-20e7b47fddc3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:11:41 compute-0 nova_compute[186329]: 2025-12-05 06:11:41.181 186333 DEBUG oslo_concurrency.lockutils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Acquiring lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:11:41 compute-0 nova_compute[186329]: 2025-12-05 06:11:41.181 186333 DEBUG oslo_concurrency.lockutils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:11:41 compute-0 nova_compute[186329]: 2025-12-05 06:11:41.419 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance 94466816-dfc5-4455-9992-20e7b47fddc3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 05 06:11:41 compute-0 nova_compute[186329]: 2025-12-05 06:11:41.419 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 05 06:11:41 compute-0 nova_compute[186329]: 2025-12-05 06:11:41.419 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 06:11:40 up 49 min,  0 user,  load average: 0.00, 0.26, 0.37\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_block_device_mapping': '1', 'num_os_type_None': '1', 'num_proj_7eb763694d8e480e9bb11451a932988d': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 05 06:11:41 compute-0 nova_compute[186329]: 2025-12-05 06:11:41.448 186333 DEBUG nova.compute.provider_tree [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:11:41 compute-0 nova_compute[186329]: 2025-12-05 06:11:41.953 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:11:41 compute-0 nova_compute[186329]: 2025-12-05 06:11:41.998 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'QFI\xfb') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:11:42 compute-0 nova_compute[186329]: 2025-12-05 06:11:42.000 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:11:42 compute-0 nova_compute[186329]: 2025-12-05 06:11:42.000 186333 DEBUG oslo_concurrency.processutils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b.part --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:11:42 compute-0 nova_compute[186329]: 2025-12-05 06:11:42.039 186333 DEBUG oslo_concurrency.processutils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b.part --force-share --output=json" returned: 0 in 0.038s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:11:42 compute-0 nova_compute[186329]: 2025-12-05 06:11:42.039 186333 DEBUG nova.virt.images [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] 6903ca06-7f44-4ad2-ab8b-0d16feef7d51 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.12/site-packages/nova/virt/images.py:278
Dec 05 06:11:42 compute-0 nova_compute[186329]: 2025-12-05 06:11:42.040 186333 DEBUG nova.privsep.utils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.12/site-packages/nova/privsep/utils.py:63
Dec 05 06:11:42 compute-0 nova_compute[186329]: 2025-12-05 06:11:42.040 186333 DEBUG oslo_concurrency.processutils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b.part /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b.converted execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:11:42 compute-0 nova_compute[186329]: 2025-12-05 06:11:42.093 186333 DEBUG oslo_concurrency.processutils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b.part /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b.converted" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:11:42 compute-0 nova_compute[186329]: 2025-12-05 06:11:42.096 186333 DEBUG oslo_concurrency.processutils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b.converted --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:11:42 compute-0 nova_compute[186329]: 2025-12-05 06:11:42.136 186333 DEBUG oslo_concurrency.processutils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b.converted --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:11:42 compute-0 nova_compute[186329]: 2025-12-05 06:11:42.137 186333 DEBUG oslo_concurrency.lockutils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.955s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:11:42 compute-0 nova_compute[186329]: 2025-12-05 06:11:42.137 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:11:42 compute-0 nova_compute[186329]: 2025-12-05 06:11:42.139 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:11:42 compute-0 nova_compute[186329]: 2025-12-05 06:11:42.140 186333 INFO oslo.privsep.daemon [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmp4vdsj6a6/privsep.sock']
Dec 05 06:11:42 compute-0 nova_compute[186329]: 2025-12-05 06:11:42.459 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 05 06:11:42 compute-0 nova_compute[186329]: 2025-12-05 06:11:42.459 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.067s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:11:42 compute-0 nova_compute[186329]: 2025-12-05 06:11:42.713 186333 INFO oslo.privsep.daemon [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Spawned new privsep daemon via rootwrap
Dec 05 06:11:42 compute-0 nova_compute[186329]: 2025-12-05 06:11:42.621 207240 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 05 06:11:42 compute-0 nova_compute[186329]: 2025-12-05 06:11:42.624 207240 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 05 06:11:42 compute-0 nova_compute[186329]: 2025-12-05 06:11:42.625 207240 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Dec 05 06:11:42 compute-0 nova_compute[186329]: 2025-12-05 06:11:42.625 207240 INFO oslo.privsep.daemon [-] privsep daemon running as pid 207240
Dec 05 06:11:42 compute-0 nova_compute[186329]: 2025-12-05 06:11:42.775 186333 DEBUG oslo_concurrency.processutils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:11:42 compute-0 nova_compute[186329]: 2025-12-05 06:11:42.824 186333 DEBUG oslo_concurrency.processutils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:11:42 compute-0 nova_compute[186329]: 2025-12-05 06:11:42.825 186333 DEBUG oslo_concurrency.lockutils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Acquiring lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:11:42 compute-0 nova_compute[186329]: 2025-12-05 06:11:42.825 186333 DEBUG oslo_concurrency.lockutils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:11:42 compute-0 nova_compute[186329]: 2025-12-05 06:11:42.825 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:11:42 compute-0 nova_compute[186329]: 2025-12-05 06:11:42.828 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:11:42 compute-0 nova_compute[186329]: 2025-12-05 06:11:42.828 186333 DEBUG oslo_concurrency.processutils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:11:42 compute-0 nova_compute[186329]: 2025-12-05 06:11:42.876 186333 DEBUG oslo_concurrency.processutils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:11:42 compute-0 nova_compute[186329]: 2025-12-05 06:11:42.877 186333 DEBUG oslo_concurrency.processutils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b,backing_fmt=raw /var/lib/nova/instances/94466816-dfc5-4455-9992-20e7b47fddc3/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:11:42 compute-0 nova_compute[186329]: 2025-12-05 06:11:42.894 186333 DEBUG oslo_concurrency.processutils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b,backing_fmt=raw /var/lib/nova/instances/94466816-dfc5-4455-9992-20e7b47fddc3/disk 1073741824" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:11:42 compute-0 nova_compute[186329]: 2025-12-05 06:11:42.895 186333 DEBUG oslo_concurrency.lockutils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.070s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:11:42 compute-0 nova_compute[186329]: 2025-12-05 06:11:42.895 186333 DEBUG oslo_concurrency.processutils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:11:42 compute-0 nova_compute[186329]: 2025-12-05 06:11:42.934 186333 DEBUG oslo_concurrency.processutils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:11:42 compute-0 nova_compute[186329]: 2025-12-05 06:11:42.934 186333 DEBUG nova.virt.disk.api [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Checking if we can resize image /var/lib/nova/instances/94466816-dfc5-4455-9992-20e7b47fddc3/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 05 06:11:42 compute-0 nova_compute[186329]: 2025-12-05 06:11:42.934 186333 DEBUG oslo_concurrency.processutils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/94466816-dfc5-4455-9992-20e7b47fddc3/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:11:42 compute-0 nova_compute[186329]: 2025-12-05 06:11:42.973 186333 DEBUG oslo_concurrency.processutils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/94466816-dfc5-4455-9992-20e7b47fddc3/disk --force-share --output=json" returned: 0 in 0.038s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:11:42 compute-0 nova_compute[186329]: 2025-12-05 06:11:42.974 186333 DEBUG nova.virt.disk.api [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Cannot resize image /var/lib/nova/instances/94466816-dfc5-4455-9992-20e7b47fddc3/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 05 06:11:42 compute-0 nova_compute[186329]: 2025-12-05 06:11:42.974 186333 DEBUG nova.virt.libvirt.driver [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Dec 05 06:11:42 compute-0 nova_compute[186329]: 2025-12-05 06:11:42.974 186333 DEBUG nova.virt.libvirt.driver [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Ensure instance console log exists: /var/lib/nova/instances/94466816-dfc5-4455-9992-20e7b47fddc3/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 05 06:11:42 compute-0 nova_compute[186329]: 2025-12-05 06:11:42.975 186333 DEBUG oslo_concurrency.lockutils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:11:42 compute-0 nova_compute[186329]: 2025-12-05 06:11:42.975 186333 DEBUG oslo_concurrency.lockutils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:11:42 compute-0 nova_compute[186329]: 2025-12-05 06:11:42.975 186333 DEBUG oslo_concurrency.lockutils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:11:43 compute-0 podman[207257]: 2025-12-05 06:11:43.456375058 +0000 UTC m=+0.044424793 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, 
tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.4)
Dec 05 06:11:43 compute-0 podman[207259]: 2025-12-05 06:11:43.465358556 +0000 UTC m=+0.049564714 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Dec 05 06:11:43 compute-0 podman[207258]: 2025-12-05 06:11:43.491390839 +0000 UTC m=+0.077023910 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, release=1755695350, com.redhat.component=ubi9-minimal-container, version=9.6, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, managed_by=edpm_ansible, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 05 06:11:43 compute-0 nova_compute[186329]: 2025-12-05 06:11:43.565 186333 DEBUG nova.network.neutron [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Successfully updated port: baa5cca1-f04c-460d-b613-48c1aef4ec5b _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Dec 05 06:11:43 compute-0 nova_compute[186329]: 2025-12-05 06:11:43.618 186333 DEBUG nova.compute.manager [req-de3ec01c-6e26-47d6-872a-db3438db2df9 req-5ca2d9d9-e19b-462b-9a7d-91ee79a900e3 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Received event network-changed-baa5cca1-f04c-460d-b613-48c1aef4ec5b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:11:43 compute-0 nova_compute[186329]: 2025-12-05 06:11:43.618 186333 DEBUG nova.compute.manager [req-de3ec01c-6e26-47d6-872a-db3438db2df9 req-5ca2d9d9-e19b-462b-9a7d-91ee79a900e3 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Refreshing instance network info cache due to event network-changed-baa5cca1-f04c-460d-b613-48c1aef4ec5b. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 05 06:11:43 compute-0 nova_compute[186329]: 2025-12-05 06:11:43.618 186333 DEBUG oslo_concurrency.lockutils [req-de3ec01c-6e26-47d6-872a-db3438db2df9 req-5ca2d9d9-e19b-462b-9a7d-91ee79a900e3 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "refresh_cache-94466816-dfc5-4455-9992-20e7b47fddc3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:11:43 compute-0 nova_compute[186329]: 2025-12-05 06:11:43.619 186333 DEBUG oslo_concurrency.lockutils [req-de3ec01c-6e26-47d6-872a-db3438db2df9 req-5ca2d9d9-e19b-462b-9a7d-91ee79a900e3 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquired lock "refresh_cache-94466816-dfc5-4455-9992-20e7b47fddc3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:11:43 compute-0 nova_compute[186329]: 2025-12-05 06:11:43.619 186333 DEBUG nova.network.neutron [req-de3ec01c-6e26-47d6-872a-db3438db2df9 req-5ca2d9d9-e19b-462b-9a7d-91ee79a900e3 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Refreshing network info cache for port baa5cca1-f04c-460d-b613-48c1aef4ec5b _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 05 06:11:44 compute-0 nova_compute[186329]: 2025-12-05 06:11:44.069 186333 DEBUG oslo_concurrency.lockutils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Acquiring lock "refresh_cache-94466816-dfc5-4455-9992-20e7b47fddc3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:11:44 compute-0 nova_compute[186329]: 2025-12-05 06:11:44.122 186333 WARNING neutronclient.v2_0.client [req-de3ec01c-6e26-47d6-872a-db3438db2df9 req-5ca2d9d9-e19b-462b-9a7d-91ee79a900e3 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:11:44 compute-0 nova_compute[186329]: 2025-12-05 06:11:44.379 186333 DEBUG nova.network.neutron [req-de3ec01c-6e26-47d6-872a-db3438db2df9 req-5ca2d9d9-e19b-462b-9a7d-91ee79a900e3 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 05 06:11:44 compute-0 nova_compute[186329]: 2025-12-05 06:11:44.455 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:11:44 compute-0 nova_compute[186329]: 2025-12-05 06:11:44.455 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:11:44 compute-0 nova_compute[186329]: 2025-12-05 06:11:44.455 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:11:44 compute-0 nova_compute[186329]: 2025-12-05 06:11:44.456 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:11:44 compute-0 nova_compute[186329]: 2025-12-05 06:11:44.456 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:11:44 compute-0 nova_compute[186329]: 2025-12-05 06:11:44.456 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:11:44 compute-0 nova_compute[186329]: 2025-12-05 06:11:44.456 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 05 06:11:44 compute-0 nova_compute[186329]: 2025-12-05 06:11:44.526 186333 DEBUG nova.network.neutron [req-de3ec01c-6e26-47d6-872a-db3438db2df9 req-5ca2d9d9-e19b-462b-9a7d-91ee79a900e3 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:11:44 compute-0 nova_compute[186329]: 2025-12-05 06:11:44.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:11:45 compute-0 nova_compute[186329]: 2025-12-05 06:11:45.030 186333 DEBUG oslo_concurrency.lockutils [req-de3ec01c-6e26-47d6-872a-db3438db2df9 req-5ca2d9d9-e19b-462b-9a7d-91ee79a900e3 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Releasing lock "refresh_cache-94466816-dfc5-4455-9992-20e7b47fddc3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:11:45 compute-0 nova_compute[186329]: 2025-12-05 06:11:45.030 186333 DEBUG oslo_concurrency.lockutils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Acquired lock "refresh_cache-94466816-dfc5-4455-9992-20e7b47fddc3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:11:45 compute-0 nova_compute[186329]: 2025-12-05 06:11:45.031 186333 DEBUG nova.network.neutron [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 05 06:11:46 compute-0 nova_compute[186329]: 2025-12-05 06:11:46.385 186333 DEBUG nova.network.neutron [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 05 06:11:46 compute-0 nova_compute[186329]: 2025-12-05 06:11:46.575 186333 WARNING neutronclient.v2_0.client [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:11:46 compute-0 nova_compute[186329]: 2025-12-05 06:11:46.727 186333 DEBUG nova.network.neutron [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Updating instance_info_cache with network_info: [{"id": "baa5cca1-f04c-460d-b613-48c1aef4ec5b", "address": "fa:16:3e:36:69:14", "network": {"id": "dfccf5a2-ddd2-463b-b875-016786dc54e3", "bridge": "br-int", "label": "tempest-TestDataModel-1925701889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ddd843275bc4c4c9124163a55e82b69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbaa5cca1-f0", "ovs_interfaceid": "baa5cca1-f04c-460d-b613-48c1aef4ec5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.230 186333 DEBUG oslo_concurrency.lockutils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Releasing lock "refresh_cache-94466816-dfc5-4455-9992-20e7b47fddc3" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.231 186333 DEBUG nova.compute.manager [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Instance network_info: |[{"id": "baa5cca1-f04c-460d-b613-48c1aef4ec5b", "address": "fa:16:3e:36:69:14", "network": {"id": "dfccf5a2-ddd2-463b-b875-016786dc54e3", "bridge": "br-int", "label": "tempest-TestDataModel-1925701889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ddd843275bc4c4c9124163a55e82b69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbaa5cca1-f0", "ovs_interfaceid": "baa5cca1-f04c-460d-b613-48c1aef4ec5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.233 186333 DEBUG nova.virt.libvirt.driver [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Start _get_guest_xml network_info=[{"id": "baa5cca1-f04c-460d-b613-48c1aef4ec5b", "address": "fa:16:3e:36:69:14", "network": {"id": "dfccf5a2-ddd2-463b-b875-016786dc54e3", "bridge": "br-int", "label": "tempest-TestDataModel-1925701889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ddd843275bc4c4c9124163a55e82b69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbaa5cca1-f0", "ovs_interfaceid": "baa5cca1-f04c-460d-b613-48c1aef4ec5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T06:07:47Z,direct_url=<?>,disk_format='qcow2',id=6903ca06-7f44-4ad2-ab8b-0d16feef7d51,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fcef582be2274b9ba43451b49b4066ec',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T06:07:49Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'device_name': '/dev/vda', 'guest_format': None, 'image_id': '6903ca06-7f44-4ad2-ab8b-0d16feef7d51'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.236 186333 WARNING nova.virt.libvirt.driver [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.237 186333 DEBUG nova.virt.driver [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='6903ca06-7f44-4ad2-ab8b-0d16feef7d51', instance_meta=NovaInstanceMeta(name='tempest-TestDataModel-server-1616981358', uuid='94466816-dfc5-4455-9992-20e7b47fddc3'), owner=OwnerMeta(userid='2f839a901b074095a8ac81f9095a0a01', username='tempest-TestDataModel-775221203-project-admin', projectid='7eb763694d8e480e9bb11451a932988d', projectname='tempest-TestDataModel-775221203'), image=ImageMeta(id='6903ca06-7f44-4ad2-ab8b-0d16feef7d51', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='cb13e320-971c-46c2-a935-d695f3631bf8', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "baa5cca1-f04c-460d-b613-48c1aef4ec5b", "address": "fa:16:3e:36:69:14", "network": {"id": "dfccf5a2-ddd2-463b-b875-016786dc54e3", "bridge": "br-int", "label": "tempest-TestDataModel-1925701889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ddd843275bc4c4c9124163a55e82b69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbaa5cca1-f0", "ovs_interfaceid": "baa5cca1-f04c-460d-b613-48c1aef4ec5b", "qbh_params": null, "qbg_params": null, "active": false, 
"vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764915107.237186) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.242 186333 DEBUG nova.virt.libvirt.host [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.242 186333 DEBUG nova.virt.libvirt.host [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.244 186333 DEBUG nova.virt.libvirt.host [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.245 186333 DEBUG nova.virt.libvirt.host [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.246 186333 DEBUG nova.virt.libvirt.driver [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.246 186333 DEBUG nova.virt.hardware [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T06:07:46Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cb13e320-971c-46c2-a935-d695f3631bf8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T06:07:47Z,direct_url=<?>,disk_format='qcow2',id=6903ca06-7f44-4ad2-ab8b-0d16feef7d51,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fcef582be2274b9ba43451b49b4066ec',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T06:07:49Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.246 186333 DEBUG nova.virt.hardware [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.247 186333 DEBUG nova.virt.hardware [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.247 186333 DEBUG nova.virt.hardware [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.247 186333 DEBUG nova.virt.hardware [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.247 186333 DEBUG nova.virt.hardware [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.247 186333 DEBUG nova.virt.hardware [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.247 186333 DEBUG nova.virt.hardware [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.248 186333 DEBUG nova.virt.hardware [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.248 186333 DEBUG nova.virt.hardware [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.248 186333 DEBUG nova.virt.hardware [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.251 186333 DEBUG nova.privsep.utils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.12/site-packages/nova/privsep/utils.py:63
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.251 186333 DEBUG nova.virt.libvirt.vif [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-05T06:11:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestDataModel-server-1616981358',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testdatamodel-server-1616981358',id=3,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7eb763694d8e480e9bb11451a932988d',ramdisk_id='',reservation_id='r-z0nypby8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestDataModel-775221203',owner_user_name='tempest-TestDataModel-775221203-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T06:11:40Z,user_data=None,user_id='2f839a901b074095a8ac81f9095a0a01',uuid=94466816-dfc5-4455-9992-20e7b47fddc3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "baa5cca1-f04c-460d-b613-48c1aef4ec5b", "address": "fa:16:3e:36:69:14", "network": {"id": "dfccf5a2-ddd2-463b-b875-016786dc54e3", "bridge": "br-int", "label": "tempest-TestDataModel-1925701889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ddd843275bc4c4c9124163a55e82b69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbaa5cca1-f0", "ovs_interfaceid": "baa5cca1-f04c-460d-b613-48c1aef4ec5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.252 186333 DEBUG nova.network.os_vif_util [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Converting VIF {"id": "baa5cca1-f04c-460d-b613-48c1aef4ec5b", "address": "fa:16:3e:36:69:14", "network": {"id": "dfccf5a2-ddd2-463b-b875-016786dc54e3", "bridge": "br-int", "label": "tempest-TestDataModel-1925701889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ddd843275bc4c4c9124163a55e82b69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbaa5cca1-f0", "ovs_interfaceid": "baa5cca1-f04c-460d-b613-48c1aef4ec5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.252 186333 DEBUG nova.network.os_vif_util [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:36:69:14,bridge_name='br-int',has_traffic_filtering=True,id=baa5cca1-f04c-460d-b613-48c1aef4ec5b,network=Network(dfccf5a2-ddd2-463b-b875-016786dc54e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbaa5cca1-f0') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.253 186333 DEBUG nova.objects.instance [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Lazy-loading 'pci_devices' on Instance uuid 94466816-dfc5-4455-9992-20e7b47fddc3 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.758 186333 DEBUG nova.virt.libvirt.driver [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] End _get_guest_xml xml=<domain type="kvm">
Dec 05 06:11:47 compute-0 nova_compute[186329]:   <uuid>94466816-dfc5-4455-9992-20e7b47fddc3</uuid>
Dec 05 06:11:47 compute-0 nova_compute[186329]:   <name>instance-00000003</name>
Dec 05 06:11:47 compute-0 nova_compute[186329]:   <memory>131072</memory>
Dec 05 06:11:47 compute-0 nova_compute[186329]:   <vcpu>1</vcpu>
Dec 05 06:11:47 compute-0 nova_compute[186329]:   <metadata>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 06:11:47 compute-0 nova_compute[186329]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:       <nova:name>tempest-TestDataModel-server-1616981358</nova:name>
Dec 05 06:11:47 compute-0 nova_compute[186329]:       <nova:creationTime>2025-12-05 06:11:47</nova:creationTime>
Dec 05 06:11:47 compute-0 nova_compute[186329]:       <nova:flavor name="m1.nano" id="cb13e320-971c-46c2-a935-d695f3631bf8">
Dec 05 06:11:47 compute-0 nova_compute[186329]:         <nova:memory>128</nova:memory>
Dec 05 06:11:47 compute-0 nova_compute[186329]:         <nova:disk>1</nova:disk>
Dec 05 06:11:47 compute-0 nova_compute[186329]:         <nova:swap>0</nova:swap>
Dec 05 06:11:47 compute-0 nova_compute[186329]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 06:11:47 compute-0 nova_compute[186329]:         <nova:vcpus>1</nova:vcpus>
Dec 05 06:11:47 compute-0 nova_compute[186329]:         <nova:extraSpecs>
Dec 05 06:11:47 compute-0 nova_compute[186329]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 05 06:11:47 compute-0 nova_compute[186329]:         </nova:extraSpecs>
Dec 05 06:11:47 compute-0 nova_compute[186329]:       </nova:flavor>
Dec 05 06:11:47 compute-0 nova_compute[186329]:       <nova:image uuid="6903ca06-7f44-4ad2-ab8b-0d16feef7d51">
Dec 05 06:11:47 compute-0 nova_compute[186329]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 05 06:11:47 compute-0 nova_compute[186329]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 05 06:11:47 compute-0 nova_compute[186329]:         <nova:minDisk>1</nova:minDisk>
Dec 05 06:11:47 compute-0 nova_compute[186329]:         <nova:minRam>0</nova:minRam>
Dec 05 06:11:47 compute-0 nova_compute[186329]:         <nova:properties>
Dec 05 06:11:47 compute-0 nova_compute[186329]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 05 06:11:47 compute-0 nova_compute[186329]:         </nova:properties>
Dec 05 06:11:47 compute-0 nova_compute[186329]:       </nova:image>
Dec 05 06:11:47 compute-0 nova_compute[186329]:       <nova:owner>
Dec 05 06:11:47 compute-0 nova_compute[186329]:         <nova:user uuid="2f839a901b074095a8ac81f9095a0a01">tempest-TestDataModel-775221203-project-admin</nova:user>
Dec 05 06:11:47 compute-0 nova_compute[186329]:         <nova:project uuid="7eb763694d8e480e9bb11451a932988d">tempest-TestDataModel-775221203</nova:project>
Dec 05 06:11:47 compute-0 nova_compute[186329]:       </nova:owner>
Dec 05 06:11:47 compute-0 nova_compute[186329]:       <nova:root type="image" uuid="6903ca06-7f44-4ad2-ab8b-0d16feef7d51"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:       <nova:ports>
Dec 05 06:11:47 compute-0 nova_compute[186329]:         <nova:port uuid="baa5cca1-f04c-460d-b613-48c1aef4ec5b">
Dec 05 06:11:47 compute-0 nova_compute[186329]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:         </nova:port>
Dec 05 06:11:47 compute-0 nova_compute[186329]:       </nova:ports>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     </nova:instance>
Dec 05 06:11:47 compute-0 nova_compute[186329]:   </metadata>
Dec 05 06:11:47 compute-0 nova_compute[186329]:   <sysinfo type="smbios">
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <system>
Dec 05 06:11:47 compute-0 nova_compute[186329]:       <entry name="manufacturer">RDO</entry>
Dec 05 06:11:47 compute-0 nova_compute[186329]:       <entry name="product">OpenStack Compute</entry>
Dec 05 06:11:47 compute-0 nova_compute[186329]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 05 06:11:47 compute-0 nova_compute[186329]:       <entry name="serial">94466816-dfc5-4455-9992-20e7b47fddc3</entry>
Dec 05 06:11:47 compute-0 nova_compute[186329]:       <entry name="uuid">94466816-dfc5-4455-9992-20e7b47fddc3</entry>
Dec 05 06:11:47 compute-0 nova_compute[186329]:       <entry name="family">Virtual Machine</entry>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     </system>
Dec 05 06:11:47 compute-0 nova_compute[186329]:   </sysinfo>
Dec 05 06:11:47 compute-0 nova_compute[186329]:   <os>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <boot dev="hd"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <smbios mode="sysinfo"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:   </os>
Dec 05 06:11:47 compute-0 nova_compute[186329]:   <features>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <acpi/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <apic/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <vmcoreinfo/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:   </features>
Dec 05 06:11:47 compute-0 nova_compute[186329]:   <clock offset="utc">
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <timer name="hpet" present="no"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:   </clock>
Dec 05 06:11:47 compute-0 nova_compute[186329]:   <cpu mode="custom" match="exact">
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <model>Nehalem</model>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:   </cpu>
Dec 05 06:11:47 compute-0 nova_compute[186329]:   <devices>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <disk type="file" device="disk">
Dec 05 06:11:47 compute-0 nova_compute[186329]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:       <source file="/var/lib/nova/instances/94466816-dfc5-4455-9992-20e7b47fddc3/disk"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:       <target dev="vda" bus="virtio"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     </disk>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <disk type="file" device="cdrom">
Dec 05 06:11:47 compute-0 nova_compute[186329]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:       <source file="/var/lib/nova/instances/94466816-dfc5-4455-9992-20e7b47fddc3/disk.config"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:       <target dev="sda" bus="sata"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     </disk>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <interface type="ethernet">
Dec 05 06:11:47 compute-0 nova_compute[186329]:       <mac address="fa:16:3e:36:69:14"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:       <model type="virtio"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:       <mtu size="1442"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:       <target dev="tapbaa5cca1-f0"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     </interface>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <serial type="pty">
Dec 05 06:11:47 compute-0 nova_compute[186329]:       <log file="/var/lib/nova/instances/94466816-dfc5-4455-9992-20e7b47fddc3/console.log" append="off"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     </serial>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <video>
Dec 05 06:11:47 compute-0 nova_compute[186329]:       <model type="virtio"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     </video>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <input type="tablet" bus="usb"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <rng model="virtio">
Dec 05 06:11:47 compute-0 nova_compute[186329]:       <backend model="random">/dev/urandom</backend>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     </rng>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <controller type="usb" index="0"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 05 06:11:47 compute-0 nova_compute[186329]:       <stats period="10"/>
Dec 05 06:11:47 compute-0 nova_compute[186329]:     </memballoon>
Dec 05 06:11:47 compute-0 nova_compute[186329]:   </devices>
Dec 05 06:11:47 compute-0 nova_compute[186329]: </domain>
Dec 05 06:11:47 compute-0 nova_compute[186329]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.759 186333 DEBUG nova.compute.manager [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Preparing to wait for external event network-vif-plugged-baa5cca1-f04c-460d-b613-48c1aef4ec5b prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.759 186333 DEBUG oslo_concurrency.lockutils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Acquiring lock "94466816-dfc5-4455-9992-20e7b47fddc3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.759 186333 DEBUG oslo_concurrency.lockutils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Lock "94466816-dfc5-4455-9992-20e7b47fddc3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.759 186333 DEBUG oslo_concurrency.lockutils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Lock "94466816-dfc5-4455-9992-20e7b47fddc3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.760 186333 DEBUG nova.virt.libvirt.vif [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-05T06:11:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestDataModel-server-1616981358',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testdatamodel-server-1616981358',id=3,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7eb763694d8e480e9bb11451a932988d',ramdisk_id='',reservation_id='r-z0nypby8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestDataModel-775221203',owner_user_name='tempest-TestDataModel-775221203-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T06:11:40Z,user_data=None,user_id='2f839a901b074095a8ac81f9095a0a01',uuid=94466816-dfc5-4455-9992-20e7b47fddc3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "baa5cca1-f04c-460d-b613-48c1aef4ec5b", "address": "fa:16:3e:36:69:14", "network": {"id": "dfccf5a2-ddd2-463b-b875-016786dc54e3", "bridge": "br-int", "label": "tempest-TestDataModel-1925701889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ddd843275bc4c4c9124163a55e82b69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbaa5cca1-f0", "ovs_interfaceid": "baa5cca1-f04c-460d-b613-48c1aef4ec5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.760 186333 DEBUG nova.network.os_vif_util [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Converting VIF {"id": "baa5cca1-f04c-460d-b613-48c1aef4ec5b", "address": "fa:16:3e:36:69:14", "network": {"id": "dfccf5a2-ddd2-463b-b875-016786dc54e3", "bridge": "br-int", "label": "tempest-TestDataModel-1925701889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ddd843275bc4c4c9124163a55e82b69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbaa5cca1-f0", "ovs_interfaceid": "baa5cca1-f04c-460d-b613-48c1aef4ec5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.761 186333 DEBUG nova.network.os_vif_util [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:36:69:14,bridge_name='br-int',has_traffic_filtering=True,id=baa5cca1-f04c-460d-b613-48c1aef4ec5b,network=Network(dfccf5a2-ddd2-463b-b875-016786dc54e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbaa5cca1-f0') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.761 186333 DEBUG os_vif [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:69:14,bridge_name='br-int',has_traffic_filtering=True,id=baa5cca1-f04c-460d-b613-48c1aef4ec5b,network=Network(dfccf5a2-ddd2-463b-b875-016786dc54e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbaa5cca1-f0') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.787 186333 DEBUG ovsdbapp.backend.ovs_idl [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.787 186333 DEBUG ovsdbapp.backend.ovs_idl [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.787 186333 DEBUG ovsdbapp.backend.ovs_idl [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.788 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.788 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] [POLLOUT] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.788 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.789 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.795 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.796 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.796 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.796 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.796 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'b4876fe9-84e4-57f5-b1bf-2ca01cee7ee2', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.798 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:11:47 compute-0 nova_compute[186329]: 2025-12-05 06:11:47.799 186333 INFO oslo.privsep.daemon [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpgsnoo5za/privsep.sock']
Dec 05 06:11:48 compute-0 nova_compute[186329]: 2025-12-05 06:11:48.378 186333 INFO oslo.privsep.daemon [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Spawned new privsep daemon via rootwrap
Dec 05 06:11:48 compute-0 nova_compute[186329]: 2025-12-05 06:11:48.283 207315 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 05 06:11:48 compute-0 nova_compute[186329]: 2025-12-05 06:11:48.286 207315 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 05 06:11:48 compute-0 nova_compute[186329]: 2025-12-05 06:11:48.287 207315 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Dec 05 06:11:48 compute-0 nova_compute[186329]: 2025-12-05 06:11:48.288 207315 INFO oslo.privsep.daemon [-] privsep daemon running as pid 207315
Dec 05 06:11:48 compute-0 nova_compute[186329]: 2025-12-05 06:11:48.563 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:11:48 compute-0 nova_compute[186329]: 2025-12-05 06:11:48.586 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:11:48 compute-0 nova_compute[186329]: 2025-12-05 06:11:48.586 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbaa5cca1-f0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:11:48 compute-0 nova_compute[186329]: 2025-12-05 06:11:48.586 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapbaa5cca1-f0, col_values=(('qos', UUID('c1a1b78a-ceb0-4d67-98d0-17d3efc9ca59')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:11:48 compute-0 nova_compute[186329]: 2025-12-05 06:11:48.587 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapbaa5cca1-f0, col_values=(('external_ids', {'iface-id': 'baa5cca1-f04c-460d-b613-48c1aef4ec5b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:36:69:14', 'vm-uuid': '94466816-dfc5-4455-9992-20e7b47fddc3'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:11:48 compute-0 NetworkManager[55434]: <info>  [1764915108.5884] manager: (tapbaa5cca1-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Dec 05 06:11:48 compute-0 nova_compute[186329]: 2025-12-05 06:11:48.589 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:11:48 compute-0 nova_compute[186329]: 2025-12-05 06:11:48.592 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:11:48 compute-0 nova_compute[186329]: 2025-12-05 06:11:48.593 186333 INFO os_vif [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:69:14,bridge_name='br-int',has_traffic_filtering=True,id=baa5cca1-f04c-460d-b613-48c1aef4ec5b,network=Network(dfccf5a2-ddd2-463b-b875-016786dc54e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbaa5cca1-f0')
Dec 05 06:11:50 compute-0 nova_compute[186329]: 2025-12-05 06:11:50.116 186333 DEBUG nova.virt.libvirt.driver [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 05 06:11:50 compute-0 nova_compute[186329]: 2025-12-05 06:11:50.117 186333 DEBUG nova.virt.libvirt.driver [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 05 06:11:50 compute-0 nova_compute[186329]: 2025-12-05 06:11:50.117 186333 DEBUG nova.virt.libvirt.driver [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] No VIF found with MAC fa:16:3e:36:69:14, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 05 06:11:50 compute-0 nova_compute[186329]: 2025-12-05 06:11:50.117 186333 INFO nova.virt.libvirt.driver [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Using config drive
Dec 05 06:11:50 compute-0 nova_compute[186329]: 2025-12-05 06:11:50.624 186333 WARNING neutronclient.v2_0.client [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:11:51 compute-0 nova_compute[186329]: 2025-12-05 06:11:51.465 186333 INFO nova.virt.libvirt.driver [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Creating config drive at /var/lib/nova/instances/94466816-dfc5-4455-9992-20e7b47fddc3/disk.config
Dec 05 06:11:51 compute-0 nova_compute[186329]: 2025-12-05 06:11:51.469 186333 DEBUG oslo_concurrency.processutils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/94466816-dfc5-4455-9992-20e7b47fddc3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp3gfgn2qb execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:11:51 compute-0 nova_compute[186329]: 2025-12-05 06:11:51.586 186333 DEBUG oslo_concurrency.processutils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/94466816-dfc5-4455-9992-20e7b47fddc3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp3gfgn2qb" returned: 0 in 0.117s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:11:51 compute-0 kernel: tun: Universal TUN/TAP device driver, 1.6
Dec 05 06:11:51 compute-0 kernel: tapbaa5cca1-f0: entered promiscuous mode
Dec 05 06:11:51 compute-0 NetworkManager[55434]: <info>  [1764915111.6323] manager: (tapbaa5cca1-f0): new Tun device (/org/freedesktop/NetworkManager/Devices/21)
Dec 05 06:11:51 compute-0 ovn_controller[95223]: 2025-12-05T06:11:51Z|00040|binding|INFO|Claiming lport baa5cca1-f04c-460d-b613-48c1aef4ec5b for this chassis.
Dec 05 06:11:51 compute-0 ovn_controller[95223]: 2025-12-05T06:11:51Z|00041|binding|INFO|baa5cca1-f04c-460d-b613-48c1aef4ec5b: Claiming fa:16:3e:36:69:14 10.100.0.13
Dec 05 06:11:51 compute-0 nova_compute[186329]: 2025-12-05 06:11:51.636 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:11:51 compute-0 nova_compute[186329]: 2025-12-05 06:11:51.637 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:11:51 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:51.644 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:36:69:14 10.100.0.13'], port_security=['fa:16:3e:36:69:14 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '94466816-dfc5-4455-9992-20e7b47fddc3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dfccf5a2-ddd2-463b-b875-016786dc54e3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7eb763694d8e480e9bb11451a932988d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '40041ae0-167e-4825-897c-a48e17a7a798', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3b4dbc03-2332-453d-bad7-decedc4781a2, chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], logical_port=baa5cca1-f04c-460d-b613-48c1aef4ec5b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:11:51 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:51.644 104041 INFO neutron.agent.ovn.metadata.agent [-] Port baa5cca1-f04c-460d-b613-48c1aef4ec5b in datapath dfccf5a2-ddd2-463b-b875-016786dc54e3 bound to our chassis
Dec 05 06:11:51 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:51.645 104041 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dfccf5a2-ddd2-463b-b875-016786dc54e3
Dec 05 06:11:51 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:51.660 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[e8fbb268-ddbf-48f2-9974-342b5a7a0f99]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:11:51 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:51.661 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdfccf5a2-d1 in ovnmeta-dfccf5a2-ddd2-463b-b875-016786dc54e3 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 05 06:11:51 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:51.663 206383 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdfccf5a2-d0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 05 06:11:51 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:51.663 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[52623b51-d0c8-4aeb-a065-e8ce19170300]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:11:51 compute-0 systemd-udevd[207343]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 06:11:51 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:51.664 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[33ecbeef-6453-4c6e-b4b8-bd1b02f8c24e]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:11:51 compute-0 NetworkManager[55434]: <info>  [1764915111.6757] device (tapbaa5cca1-f0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 06:11:51 compute-0 NetworkManager[55434]: <info>  [1764915111.6762] device (tapbaa5cca1-f0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 06:11:51 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:51.677 104153 DEBUG oslo.privsep.daemon [-] privsep: reply[59ec3720-206b-41c4-870b-1957d2702b62]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:11:51 compute-0 systemd-machined[152967]: New machine qemu-1-instance-00000003.
Dec 05 06:11:51 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:51.703 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[1f5e9103-0c8e-4c30-96b5-cfc9eada5336]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:11:51 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:51.703 104041 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpbjijwd3v/privsep.sock']
Dec 05 06:11:51 compute-0 nova_compute[186329]: 2025-12-05 06:11:51.704 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:11:51 compute-0 systemd[1]: Started Virtual Machine qemu-1-instance-00000003.
Dec 05 06:11:51 compute-0 ovn_controller[95223]: 2025-12-05T06:11:51Z|00042|binding|INFO|Setting lport baa5cca1-f04c-460d-b613-48c1aef4ec5b ovn-installed in OVS
Dec 05 06:11:51 compute-0 ovn_controller[95223]: 2025-12-05T06:11:51Z|00043|binding|INFO|Setting lport baa5cca1-f04c-460d-b613-48c1aef4ec5b up in Southbound
Dec 05 06:11:51 compute-0 nova_compute[186329]: 2025-12-05 06:11:51.711 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:11:52 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:52.290 104041 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Dec 05 06:11:52 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:52.291 104041 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpbjijwd3v/privsep.sock __init__ /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:377
Dec 05 06:11:52 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:52.202 207374 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 05 06:11:52 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:52.205 207374 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 05 06:11:52 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:52.206 207374 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Dec 05 06:11:52 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:52.206 207374 INFO oslo.privsep.daemon [-] privsep daemon running as pid 207374
Dec 05 06:11:52 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:52.292 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[24ec4330-077c-4491-9620-e9a4a6f481d8]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:11:52 compute-0 nova_compute[186329]: 2025-12-05 06:11:52.472 186333 DEBUG nova.compute.manager [req-8a0db32c-2667-4bf0-9f8b-58bf10991b7a req-5c454ce6-dfde-4feb-94bd-345046c792d1 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Received event network-vif-plugged-baa5cca1-f04c-460d-b613-48c1aef4ec5b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:11:52 compute-0 nova_compute[186329]: 2025-12-05 06:11:52.473 186333 DEBUG oslo_concurrency.lockutils [req-8a0db32c-2667-4bf0-9f8b-58bf10991b7a req-5c454ce6-dfde-4feb-94bd-345046c792d1 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "94466816-dfc5-4455-9992-20e7b47fddc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:11:52 compute-0 nova_compute[186329]: 2025-12-05 06:11:52.474 186333 DEBUG oslo_concurrency.lockutils [req-8a0db32c-2667-4bf0-9f8b-58bf10991b7a req-5c454ce6-dfde-4feb-94bd-345046c792d1 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "94466816-dfc5-4455-9992-20e7b47fddc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:11:52 compute-0 nova_compute[186329]: 2025-12-05 06:11:52.474 186333 DEBUG oslo_concurrency.lockutils [req-8a0db32c-2667-4bf0-9f8b-58bf10991b7a req-5c454ce6-dfde-4feb-94bd-345046c792d1 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "94466816-dfc5-4455-9992-20e7b47fddc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:11:52 compute-0 nova_compute[186329]: 2025-12-05 06:11:52.474 186333 DEBUG nova.compute.manager [req-8a0db32c-2667-4bf0-9f8b-58bf10991b7a req-5c454ce6-dfde-4feb-94bd-345046c792d1 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Processing event network-vif-plugged-baa5cca1-f04c-460d-b613-48c1aef4ec5b _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 05 06:11:52 compute-0 nova_compute[186329]: 2025-12-05 06:11:52.475 186333 DEBUG nova.compute.manager [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 05 06:11:52 compute-0 nova_compute[186329]: 2025-12-05 06:11:52.483 186333 DEBUG nova.virt.libvirt.driver [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Dec 05 06:11:52 compute-0 nova_compute[186329]: 2025-12-05 06:11:52.485 186333 INFO nova.virt.libvirt.driver [-] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Instance spawned successfully.
Dec 05 06:11:52 compute-0 nova_compute[186329]: 2025-12-05 06:11:52.485 186333 DEBUG nova.virt.libvirt.driver [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Dec 05 06:11:52 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:52.687 207374 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:11:52 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:52.687 207374 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:11:52 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:52.687 207374 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:11:52 compute-0 nova_compute[186329]: 2025-12-05 06:11:52.992 186333 DEBUG nova.virt.libvirt.driver [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:11:52 compute-0 nova_compute[186329]: 2025-12-05 06:11:52.993 186333 DEBUG nova.virt.libvirt.driver [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:11:52 compute-0 nova_compute[186329]: 2025-12-05 06:11:52.993 186333 DEBUG nova.virt.libvirt.driver [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:11:52 compute-0 nova_compute[186329]: 2025-12-05 06:11:52.994 186333 DEBUG nova.virt.libvirt.driver [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:11:52 compute-0 nova_compute[186329]: 2025-12-05 06:11:52.994 186333 DEBUG nova.virt.libvirt.driver [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:11:52 compute-0 nova_compute[186329]: 2025-12-05 06:11:52.995 186333 DEBUG nova.virt.libvirt.driver [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:53.067 207374 INFO oslo_service.backend [-] Loading backend: eventlet
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:53.072 207374 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:53.127 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[e228e224-a7a2-4892-8f8d-ba107a4cc125]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:11:53 compute-0 NetworkManager[55434]: <info>  [1764915113.1394] manager: (tapdfccf5a2-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/22)
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:53.140 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[a6ea0a93-c5b6-4e9d-983b-ec6f4142abd0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:11:53 compute-0 systemd-udevd[207346]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:53.166 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[c799fba6-b5f1-47ad-8662-9f8ed92e17b9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:53.170 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[9294d86b-0d51-414c-9a69-17ef65126d54]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:11:53 compute-0 NetworkManager[55434]: <info>  [1764915113.1854] device (tapdfccf5a2-d0): carrier: link connected
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:53.191 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[da6fe778-8347-47ff-99d5-b49d549d6ab5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:53.204 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[cf2a49f6-a477-48cb-b454-b22a56e2cd7c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdfccf5a2-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:88:7c:f8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 299172, 'reachable_time': 37846, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 207391, 'error': None, 'target': 'ovnmeta-dfccf5a2-ddd2-463b-b875-016786dc54e3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:53.217 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[a4509186-1897-445f-8346-acfc069f9fbf]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe88:7cf8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 299172, 'tstamp': 299172}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 207392, 'error': None, 'target': 'ovnmeta-dfccf5a2-ddd2-463b-b875-016786dc54e3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:53.230 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[7bf40079-221b-46bb-80b0-3b858e05abf1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdfccf5a2-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:88:7c:f8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 299172, 'reachable_time': 37846, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 207393, 'error': None, 'target': 'ovnmeta-dfccf5a2-ddd2-463b-b875-016786dc54e3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:53.253 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[cca20f8c-ebc8-4582-9ade-7d585df3da41]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:53.296 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[66ca9ad4-bf6d-4da6-9676-6d72a44d05cc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:53.297 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdfccf5a2-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:53.297 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:53.298 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdfccf5a2-d0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:11:53 compute-0 kernel: tapdfccf5a2-d0: entered promiscuous mode
Dec 05 06:11:53 compute-0 NetworkManager[55434]: <info>  [1764915113.3012] manager: (tapdfccf5a2-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/23)
Dec 05 06:11:53 compute-0 nova_compute[186329]: 2025-12-05 06:11:53.302 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:53.303 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdfccf5a2-d0, col_values=(('external_ids', {'iface-id': '0987f272-b86c-4b2e-8cfe-0c571c93fa52'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:11:53 compute-0 ovn_controller[95223]: 2025-12-05T06:11:53Z|00044|binding|INFO|Releasing lport 0987f272-b86c-4b2e-8cfe-0c571c93fa52 from this chassis (sb_readonly=0)
Dec 05 06:11:53 compute-0 nova_compute[186329]: 2025-12-05 06:11:53.304 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:53.306 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[0b466456-1804-4be6-885d-06f84c3b7d7d]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:53.312 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dfccf5a2-ddd2-463b-b875-016786dc54e3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dfccf5a2-ddd2-463b-b875-016786dc54e3.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:53.312 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dfccf5a2-ddd2-463b-b875-016786dc54e3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dfccf5a2-ddd2-463b-b875-016786dc54e3.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:53.312 104041 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for dfccf5a2-ddd2-463b-b875-016786dc54e3 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:53.312 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dfccf5a2-ddd2-463b-b875-016786dc54e3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dfccf5a2-ddd2-463b-b875-016786dc54e3.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:53.312 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[d370f996-d57a-4ab1-ad7b-3400961f9bc0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:53.313 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dfccf5a2-ddd2-463b-b875-016786dc54e3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dfccf5a2-ddd2-463b-b875-016786dc54e3.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:53.313 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[0df2fff1-82cf-4786-b739-173153b61f19]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:53.313 104041 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]: global
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]:     log         /dev/log local0 debug
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]:     log-tag     haproxy-metadata-proxy-dfccf5a2-ddd2-463b-b875-016786dc54e3
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]:     user        root
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]:     group       root
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]:     maxconn     1024
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]:     pidfile     /var/lib/neutron/external/pids/dfccf5a2-ddd2-463b-b875-016786dc54e3.pid.haproxy
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]:     daemon
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]: defaults
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]:     log global
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]:     mode http
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]:     option httplog
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]:     option dontlognull
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]:     option http-server-close
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]:     option forwardfor
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]:     retries                 3
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]:     timeout http-request    30s
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]:     timeout connect         30s
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]:     timeout client          32s
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]:     timeout server          32s
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]:     timeout http-keep-alive 30s
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]: listen listener
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]:     bind 169.254.169.254:80
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]:     
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]:     http-request add-header X-OVN-Network-ID dfccf5a2-ddd2-463b-b875-016786dc54e3
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 05 06:11:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:53.313 104041 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-dfccf5a2-ddd2-463b-b875-016786dc54e3', 'env', 'PROCESS_TAG=haproxy-dfccf5a2-ddd2-463b-b875-016786dc54e3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/dfccf5a2-ddd2-463b-b875-016786dc54e3.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 05 06:11:53 compute-0 nova_compute[186329]: 2025-12-05 06:11:53.316 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:11:53 compute-0 nova_compute[186329]: 2025-12-05 06:11:53.501 186333 INFO nova.compute.manager [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Took 12.32 seconds to spawn the instance on the hypervisor.
Dec 05 06:11:53 compute-0 nova_compute[186329]: 2025-12-05 06:11:53.502 186333 DEBUG nova.compute.manager [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 05 06:11:53 compute-0 nova_compute[186329]: 2025-12-05 06:11:53.564 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:11:53 compute-0 nova_compute[186329]: 2025-12-05 06:11:53.587 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:11:53 compute-0 podman[207422]: 2025-12-05 06:11:53.621112256 +0000 UTC m=+0.030870098 container create 3ac9d7b4cccaf7ad771a5d4ecbb16e1a0e1a029eef48d3a1bb06a0b4dfd17ba7 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-dfccf5a2-ddd2-463b-b875-016786dc54e3, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 05 06:11:53 compute-0 systemd[1]: Started libpod-conmon-3ac9d7b4cccaf7ad771a5d4ecbb16e1a0e1a029eef48d3a1bb06a0b4dfd17ba7.scope.
Dec 05 06:11:53 compute-0 systemd[1]: Started libcrun container.
Dec 05 06:11:53 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0a946b933cda143c0e221ad5db8b56e5a085e798cf46db92ed4ce0f736c5d227/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 06:11:53 compute-0 podman[207422]: 2025-12-05 06:11:53.672264465 +0000 UTC m=+0.082022307 container init 3ac9d7b4cccaf7ad771a5d4ecbb16e1a0e1a029eef48d3a1bb06a0b4dfd17ba7 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-dfccf5a2-ddd2-463b-b875-016786dc54e3, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Dec 05 06:11:53 compute-0 podman[207422]: 2025-12-05 06:11:53.676560201 +0000 UTC m=+0.086318043 container start 3ac9d7b4cccaf7ad771a5d4ecbb16e1a0e1a029eef48d3a1bb06a0b4dfd17ba7 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-dfccf5a2-ddd2-463b-b875-016786dc54e3, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 05 06:11:53 compute-0 podman[207422]: 2025-12-05 06:11:53.60749283 +0000 UTC m=+0.017250693 image pull 773fbabd60bc1c8e2ad203301a187a593937fa42bcc751aba0971bd96baac0cb quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current
Dec 05 06:11:53 compute-0 neutron-haproxy-ovnmeta-dfccf5a2-ddd2-463b-b875-016786dc54e3[207434]: [NOTICE]   (207438) : New worker (207440) forked
Dec 05 06:11:53 compute-0 neutron-haproxy-ovnmeta-dfccf5a2-ddd2-463b-b875-016786dc54e3[207434]: [NOTICE]   (207438) : Loading success.
Dec 05 06:11:54 compute-0 nova_compute[186329]: 2025-12-05 06:11:54.021 186333 INFO nova.compute.manager [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Took 17.48 seconds to build instance.
Dec 05 06:11:54 compute-0 nova_compute[186329]: 2025-12-05 06:11:54.520 186333 DEBUG nova.compute.manager [req-bb22e854-a6cd-41ee-a387-21f24908bddb req-b09bb19f-62bf-4288-9525-4bb07ec5dc9b fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Received event network-vif-plugged-baa5cca1-f04c-460d-b613-48c1aef4ec5b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:11:54 compute-0 nova_compute[186329]: 2025-12-05 06:11:54.521 186333 DEBUG oslo_concurrency.lockutils [req-bb22e854-a6cd-41ee-a387-21f24908bddb req-b09bb19f-62bf-4288-9525-4bb07ec5dc9b fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "94466816-dfc5-4455-9992-20e7b47fddc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:11:54 compute-0 nova_compute[186329]: 2025-12-05 06:11:54.522 186333 DEBUG oslo_concurrency.lockutils [req-bb22e854-a6cd-41ee-a387-21f24908bddb req-b09bb19f-62bf-4288-9525-4bb07ec5dc9b fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "94466816-dfc5-4455-9992-20e7b47fddc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:11:54 compute-0 nova_compute[186329]: 2025-12-05 06:11:54.522 186333 DEBUG oslo_concurrency.lockutils [req-bb22e854-a6cd-41ee-a387-21f24908bddb req-b09bb19f-62bf-4288-9525-4bb07ec5dc9b fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "94466816-dfc5-4455-9992-20e7b47fddc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:11:54 compute-0 nova_compute[186329]: 2025-12-05 06:11:54.522 186333 DEBUG nova.compute.manager [req-bb22e854-a6cd-41ee-a387-21f24908bddb req-b09bb19f-62bf-4288-9525-4bb07ec5dc9b fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] No waiting events found dispatching network-vif-plugged-baa5cca1-f04c-460d-b613-48c1aef4ec5b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:11:54 compute-0 nova_compute[186329]: 2025-12-05 06:11:54.522 186333 WARNING nova.compute.manager [req-bb22e854-a6cd-41ee-a387-21f24908bddb req-b09bb19f-62bf-4288-9525-4bb07ec5dc9b fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Received unexpected event network-vif-plugged-baa5cca1-f04c-460d-b613-48c1aef4ec5b for instance with vm_state active and task_state None.
Dec 05 06:11:54 compute-0 nova_compute[186329]: 2025-12-05 06:11:54.524 186333 DEBUG oslo_concurrency.lockutils [None req-ae471e47-0488-4f22-bf5e-37336708abfc 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Lock "94466816-dfc5-4455-9992-20e7b47fddc3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.993s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:11:57 compute-0 nova_compute[186329]: 2025-12-05 06:11:57.055 186333 DEBUG oslo_concurrency.lockutils [None req-f02da893-86f8-41a8-bc18-5aee6e7493e6 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Acquiring lock "94466816-dfc5-4455-9992-20e7b47fddc3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:11:57 compute-0 nova_compute[186329]: 2025-12-05 06:11:57.056 186333 DEBUG oslo_concurrency.lockutils [None req-f02da893-86f8-41a8-bc18-5aee6e7493e6 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Lock "94466816-dfc5-4455-9992-20e7b47fddc3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:11:57 compute-0 nova_compute[186329]: 2025-12-05 06:11:57.057 186333 DEBUG oslo_concurrency.lockutils [None req-f02da893-86f8-41a8-bc18-5aee6e7493e6 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Acquiring lock "94466816-dfc5-4455-9992-20e7b47fddc3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:11:57 compute-0 nova_compute[186329]: 2025-12-05 06:11:57.057 186333 DEBUG oslo_concurrency.lockutils [None req-f02da893-86f8-41a8-bc18-5aee6e7493e6 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Lock "94466816-dfc5-4455-9992-20e7b47fddc3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:11:57 compute-0 nova_compute[186329]: 2025-12-05 06:11:57.057 186333 DEBUG oslo_concurrency.lockutils [None req-f02da893-86f8-41a8-bc18-5aee6e7493e6 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Lock "94466816-dfc5-4455-9992-20e7b47fddc3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:11:57 compute-0 nova_compute[186329]: 2025-12-05 06:11:57.064 186333 INFO nova.compute.manager [None req-f02da893-86f8-41a8-bc18-5aee6e7493e6 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Terminating instance
Dec 05 06:11:57 compute-0 nova_compute[186329]: 2025-12-05 06:11:57.573 186333 DEBUG nova.compute.manager [None req-f02da893-86f8-41a8-bc18-5aee6e7493e6 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 05 06:11:57 compute-0 kernel: tapbaa5cca1-f0 (unregistering): left promiscuous mode
Dec 05 06:11:57 compute-0 NetworkManager[55434]: <info>  [1764915117.5910] device (tapbaa5cca1-f0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 06:11:57 compute-0 nova_compute[186329]: 2025-12-05 06:11:57.598 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:11:57 compute-0 ovn_controller[95223]: 2025-12-05T06:11:57Z|00045|binding|INFO|Releasing lport baa5cca1-f04c-460d-b613-48c1aef4ec5b from this chassis (sb_readonly=0)
Dec 05 06:11:57 compute-0 ovn_controller[95223]: 2025-12-05T06:11:57Z|00046|binding|INFO|Setting lport baa5cca1-f04c-460d-b613-48c1aef4ec5b down in Southbound
Dec 05 06:11:57 compute-0 ovn_controller[95223]: 2025-12-05T06:11:57Z|00047|binding|INFO|Removing iface tapbaa5cca1-f0 ovn-installed in OVS
Dec 05 06:11:57 compute-0 nova_compute[186329]: 2025-12-05 06:11:57.602 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:11:57 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:57.615 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:36:69:14 10.100.0.13'], port_security=['fa:16:3e:36:69:14 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '94466816-dfc5-4455-9992-20e7b47fddc3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dfccf5a2-ddd2-463b-b875-016786dc54e3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7eb763694d8e480e9bb11451a932988d', 'neutron:revision_number': '5', 'neutron:security_group_ids': '40041ae0-167e-4825-897c-a48e17a7a798', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3b4dbc03-2332-453d-bad7-decedc4781a2, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], logical_port=baa5cca1-f04c-460d-b613-48c1aef4ec5b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:11:57 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:57.615 104041 INFO neutron.agent.ovn.metadata.agent [-] Port baa5cca1-f04c-460d-b613-48c1aef4ec5b in datapath dfccf5a2-ddd2-463b-b875-016786dc54e3 unbound from our chassis
Dec 05 06:11:57 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:57.616 104041 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dfccf5a2-ddd2-463b-b875-016786dc54e3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 05 06:11:57 compute-0 nova_compute[186329]: 2025-12-05 06:11:57.617 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:11:57 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:57.617 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[634cefe0-81ac-4402-bc20-9871af33e9d7]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:11:57 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:57.617 104041 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-dfccf5a2-ddd2-463b-b875-016786dc54e3 namespace which is not needed anymore
Dec 05 06:11:57 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000003.scope: Deactivated successfully.
Dec 05 06:11:57 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000003.scope: Consumed 5.517s CPU time.
Dec 05 06:11:57 compute-0 systemd-machined[152967]: Machine qemu-1-instance-00000003 terminated.
Dec 05 06:11:57 compute-0 neutron-haproxy-ovnmeta-dfccf5a2-ddd2-463b-b875-016786dc54e3[207434]: [NOTICE]   (207438) : haproxy version is 3.0.5-8e879a5
Dec 05 06:11:57 compute-0 neutron-haproxy-ovnmeta-dfccf5a2-ddd2-463b-b875-016786dc54e3[207434]: [NOTICE]   (207438) : path to executable is /usr/sbin/haproxy
Dec 05 06:11:57 compute-0 neutron-haproxy-ovnmeta-dfccf5a2-ddd2-463b-b875-016786dc54e3[207434]: [WARNING]  (207438) : Exiting Master process...
Dec 05 06:11:57 compute-0 podman[207467]: 2025-12-05 06:11:57.696190758 +0000 UTC m=+0.021694909 container kill 3ac9d7b4cccaf7ad771a5d4ecbb16e1a0e1a029eef48d3a1bb06a0b4dfd17ba7 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-dfccf5a2-ddd2-463b-b875-016786dc54e3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Dec 05 06:11:57 compute-0 neutron-haproxy-ovnmeta-dfccf5a2-ddd2-463b-b875-016786dc54e3[207434]: [ALERT]    (207438) : Current worker (207440) exited with code 143 (Terminated)
Dec 05 06:11:57 compute-0 neutron-haproxy-ovnmeta-dfccf5a2-ddd2-463b-b875-016786dc54e3[207434]: [WARNING]  (207438) : All workers exited. Exiting... (0)
Dec 05 06:11:57 compute-0 systemd[1]: libpod-3ac9d7b4cccaf7ad771a5d4ecbb16e1a0e1a029eef48d3a1bb06a0b4dfd17ba7.scope: Deactivated successfully.
Dec 05 06:11:57 compute-0 nova_compute[186329]: 2025-12-05 06:11:57.699 186333 DEBUG nova.compute.manager [req-c71cab32-5949-4c99-9ad1-f866105d00f5 req-c75af1d5-d4e1-47ef-87ec-4920271cb060 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Received event network-vif-unplugged-baa5cca1-f04c-460d-b613-48c1aef4ec5b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:11:57 compute-0 nova_compute[186329]: 2025-12-05 06:11:57.699 186333 DEBUG oslo_concurrency.lockutils [req-c71cab32-5949-4c99-9ad1-f866105d00f5 req-c75af1d5-d4e1-47ef-87ec-4920271cb060 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "94466816-dfc5-4455-9992-20e7b47fddc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:11:57 compute-0 nova_compute[186329]: 2025-12-05 06:11:57.700 186333 DEBUG oslo_concurrency.lockutils [req-c71cab32-5949-4c99-9ad1-f866105d00f5 req-c75af1d5-d4e1-47ef-87ec-4920271cb060 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "94466816-dfc5-4455-9992-20e7b47fddc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:11:57 compute-0 nova_compute[186329]: 2025-12-05 06:11:57.700 186333 DEBUG oslo_concurrency.lockutils [req-c71cab32-5949-4c99-9ad1-f866105d00f5 req-c75af1d5-d4e1-47ef-87ec-4920271cb060 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "94466816-dfc5-4455-9992-20e7b47fddc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:11:57 compute-0 nova_compute[186329]: 2025-12-05 06:11:57.700 186333 DEBUG nova.compute.manager [req-c71cab32-5949-4c99-9ad1-f866105d00f5 req-c75af1d5-d4e1-47ef-87ec-4920271cb060 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] No waiting events found dispatching network-vif-unplugged-baa5cca1-f04c-460d-b613-48c1aef4ec5b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:11:57 compute-0 nova_compute[186329]: 2025-12-05 06:11:57.700 186333 DEBUG nova.compute.manager [req-c71cab32-5949-4c99-9ad1-f866105d00f5 req-c75af1d5-d4e1-47ef-87ec-4920271cb060 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Received event network-vif-unplugged-baa5cca1-f04c-460d-b613-48c1aef4ec5b for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 05 06:11:57 compute-0 podman[207480]: 2025-12-05 06:11:57.727080362 +0000 UTC m=+0.016218363 container died 3ac9d7b4cccaf7ad771a5d4ecbb16e1a0e1a029eef48d3a1bb06a0b4dfd17ba7 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-dfccf5a2-ddd2-463b-b875-016786dc54e3, org.label-schema.license=GPLv2, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 05 06:11:57 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3ac9d7b4cccaf7ad771a5d4ecbb16e1a0e1a029eef48d3a1bb06a0b4dfd17ba7-userdata-shm.mount: Deactivated successfully.
Dec 05 06:11:57 compute-0 systemd[1]: var-lib-containers-storage-overlay-0a946b933cda143c0e221ad5db8b56e5a085e798cf46db92ed4ce0f736c5d227-merged.mount: Deactivated successfully.
Dec 05 06:11:57 compute-0 podman[207480]: 2025-12-05 06:11:57.746357313 +0000 UTC m=+0.035495305 container cleanup 3ac9d7b4cccaf7ad771a5d4ecbb16e1a0e1a029eef48d3a1bb06a0b4dfd17ba7 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-dfccf5a2-ddd2-463b-b875-016786dc54e3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 05 06:11:57 compute-0 systemd[1]: libpod-conmon-3ac9d7b4cccaf7ad771a5d4ecbb16e1a0e1a029eef48d3a1bb06a0b4dfd17ba7.scope: Deactivated successfully.
Dec 05 06:11:57 compute-0 podman[207481]: 2025-12-05 06:11:57.754483711 +0000 UTC m=+0.040870631 container remove 3ac9d7b4cccaf7ad771a5d4ecbb16e1a0e1a029eef48d3a1bb06a0b4dfd17ba7 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-dfccf5a2-ddd2-463b-b875-016786dc54e3, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 05 06:11:57 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:57.764 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[6f14715a-379c-453f-b2ee-da24a201ef2d]: (4, ("Fri Dec  5 06:11:57 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-dfccf5a2-ddd2-463b-b875-016786dc54e3 (3ac9d7b4cccaf7ad771a5d4ecbb16e1a0e1a029eef48d3a1bb06a0b4dfd17ba7)\n3ac9d7b4cccaf7ad771a5d4ecbb16e1a0e1a029eef48d3a1bb06a0b4dfd17ba7\nFri Dec  5 06:11:57 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-dfccf5a2-ddd2-463b-b875-016786dc54e3 (3ac9d7b4cccaf7ad771a5d4ecbb16e1a0e1a029eef48d3a1bb06a0b4dfd17ba7)\n3ac9d7b4cccaf7ad771a5d4ecbb16e1a0e1a029eef48d3a1bb06a0b4dfd17ba7\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:11:57 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:57.765 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[854153f2-3a2c-4782-a270-f271fad872a5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:11:57 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:57.765 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dfccf5a2-ddd2-463b-b875-016786dc54e3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dfccf5a2-ddd2-463b-b875-016786dc54e3.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:11:57 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:57.766 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[e26c44b4-0f53-4ebc-97f9-21fe63ab3425]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:11:57 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:57.766 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdfccf5a2-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:11:57 compute-0 nova_compute[186329]: 2025-12-05 06:11:57.767 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:11:57 compute-0 kernel: tapdfccf5a2-d0: left promiscuous mode
Dec 05 06:11:57 compute-0 nova_compute[186329]: 2025-12-05 06:11:57.783 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:11:57 compute-0 NetworkManager[55434]: <info>  [1764915117.7854] manager: (tapbaa5cca1-f0): new Tun device (/org/freedesktop/NetworkManager/Devices/24)
Dec 05 06:11:57 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:57.785 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[f77dff52-a6f5-4b8b-b2c0-78ed1880dc09]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:11:57 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:57.795 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[ca12d9b1-6cfe-43cb-b84a-37f24eceb246]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:11:57 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:57.795 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[fb3de7dc-71ed-4fe5-9fb5-865e13372aa1]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:11:57 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:57.807 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[7953db0e-c457-48c3-8327-7d6d769eb4aa]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 299166, 'reachable_time': 27558, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 207517, 'error': None, 'target': 'ovnmeta-dfccf5a2-ddd2-463b-b875-016786dc54e3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:11:57 compute-0 systemd[1]: run-netns-ovnmeta\x2ddfccf5a2\x2dddd2\x2d463b\x2db875\x2d016786dc54e3.mount: Deactivated successfully.
Dec 05 06:11:57 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:57.813 104153 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-dfccf5a2-ddd2-463b-b875-016786dc54e3 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 05 06:11:57 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:11:57.814 104153 DEBUG oslo.privsep.daemon [-] privsep: reply[997967d0-52ab-49b7-a620-be0745352c71]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:11:57 compute-0 nova_compute[186329]: 2025-12-05 06:11:57.813 186333 INFO nova.virt.libvirt.driver [-] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Instance destroyed successfully.
Dec 05 06:11:57 compute-0 nova_compute[186329]: 2025-12-05 06:11:57.813 186333 DEBUG nova.objects.instance [None req-f02da893-86f8-41a8-bc18-5aee6e7493e6 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Lazy-loading 'resources' on Instance uuid 94466816-dfc5-4455-9992-20e7b47fddc3 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:11:58 compute-0 nova_compute[186329]: 2025-12-05 06:11:58.318 186333 DEBUG nova.virt.libvirt.vif [None req-f02da893-86f8-41a8-bc18-5aee6e7493e6 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-12-05T06:11:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestDataModel-server-1616981358',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testdatamodel-server-1616981358',id=3,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T06:11:53Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7eb763694d8e480e9bb11451a932988d',ramdisk_id='',reservation_id='r-z0nypby8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_p
roject_name='tempest-TestDataModel-775221203',owner_user_name='tempest-TestDataModel-775221203-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T06:11:53Z,user_data=None,user_id='2f839a901b074095a8ac81f9095a0a01',uuid=94466816-dfc5-4455-9992-20e7b47fddc3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "baa5cca1-f04c-460d-b613-48c1aef4ec5b", "address": "fa:16:3e:36:69:14", "network": {"id": "dfccf5a2-ddd2-463b-b875-016786dc54e3", "bridge": "br-int", "label": "tempest-TestDataModel-1925701889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ddd843275bc4c4c9124163a55e82b69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbaa5cca1-f0", "ovs_interfaceid": "baa5cca1-f04c-460d-b613-48c1aef4ec5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 05 06:11:58 compute-0 nova_compute[186329]: 2025-12-05 06:11:58.318 186333 DEBUG nova.network.os_vif_util [None req-f02da893-86f8-41a8-bc18-5aee6e7493e6 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Converting VIF {"id": "baa5cca1-f04c-460d-b613-48c1aef4ec5b", "address": "fa:16:3e:36:69:14", "network": {"id": "dfccf5a2-ddd2-463b-b875-016786dc54e3", "bridge": "br-int", "label": "tempest-TestDataModel-1925701889-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ddd843275bc4c4c9124163a55e82b69", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbaa5cca1-f0", "ovs_interfaceid": "baa5cca1-f04c-460d-b613-48c1aef4ec5b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:11:58 compute-0 nova_compute[186329]: 2025-12-05 06:11:58.319 186333 DEBUG nova.network.os_vif_util [None req-f02da893-86f8-41a8-bc18-5aee6e7493e6 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:36:69:14,bridge_name='br-int',has_traffic_filtering=True,id=baa5cca1-f04c-460d-b613-48c1aef4ec5b,network=Network(dfccf5a2-ddd2-463b-b875-016786dc54e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbaa5cca1-f0') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:11:58 compute-0 nova_compute[186329]: 2025-12-05 06:11:58.319 186333 DEBUG os_vif [None req-f02da893-86f8-41a8-bc18-5aee6e7493e6 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:69:14,bridge_name='br-int',has_traffic_filtering=True,id=baa5cca1-f04c-460d-b613-48c1aef4ec5b,network=Network(dfccf5a2-ddd2-463b-b875-016786dc54e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbaa5cca1-f0') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 05 06:11:58 compute-0 nova_compute[186329]: 2025-12-05 06:11:58.320 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:11:58 compute-0 nova_compute[186329]: 2025-12-05 06:11:58.321 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbaa5cca1-f0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:11:58 compute-0 nova_compute[186329]: 2025-12-05 06:11:58.322 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:11:58 compute-0 nova_compute[186329]: 2025-12-05 06:11:58.322 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:11:58 compute-0 nova_compute[186329]: 2025-12-05 06:11:58.323 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:11:58 compute-0 nova_compute[186329]: 2025-12-05 06:11:58.324 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:11:58 compute-0 nova_compute[186329]: 2025-12-05 06:11:58.324 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=c1a1b78a-ceb0-4d67-98d0-17d3efc9ca59) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:11:58 compute-0 nova_compute[186329]: 2025-12-05 06:11:58.326 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:11:58 compute-0 nova_compute[186329]: 2025-12-05 06:11:58.328 186333 INFO os_vif [None req-f02da893-86f8-41a8-bc18-5aee6e7493e6 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:69:14,bridge_name='br-int',has_traffic_filtering=True,id=baa5cca1-f04c-460d-b613-48c1aef4ec5b,network=Network(dfccf5a2-ddd2-463b-b875-016786dc54e3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbaa5cca1-f0')
Dec 05 06:11:58 compute-0 nova_compute[186329]: 2025-12-05 06:11:58.328 186333 INFO nova.virt.libvirt.driver [None req-f02da893-86f8-41a8-bc18-5aee6e7493e6 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Deleting instance files /var/lib/nova/instances/94466816-dfc5-4455-9992-20e7b47fddc3_del
Dec 05 06:11:58 compute-0 nova_compute[186329]: 2025-12-05 06:11:58.329 186333 INFO nova.virt.libvirt.driver [None req-f02da893-86f8-41a8-bc18-5aee6e7493e6 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Deletion of /var/lib/nova/instances/94466816-dfc5-4455-9992-20e7b47fddc3_del complete
Dec 05 06:11:58 compute-0 nova_compute[186329]: 2025-12-05 06:11:58.566 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:11:58 compute-0 nova_compute[186329]: 2025-12-05 06:11:58.837 186333 INFO nova.compute.manager [None req-f02da893-86f8-41a8-bc18-5aee6e7493e6 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Took 1.26 seconds to destroy the instance on the hypervisor.
Dec 05 06:11:58 compute-0 nova_compute[186329]: 2025-12-05 06:11:58.837 186333 DEBUG oslo.service.backend._eventlet.loopingcall [None req-f02da893-86f8-41a8-bc18-5aee6e7493e6 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 05 06:11:58 compute-0 nova_compute[186329]: 2025-12-05 06:11:58.837 186333 DEBUG nova.compute.manager [-] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 05 06:11:58 compute-0 nova_compute[186329]: 2025-12-05 06:11:58.837 186333 DEBUG nova.network.neutron [-] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 05 06:11:58 compute-0 nova_compute[186329]: 2025-12-05 06:11:58.837 186333 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:11:59 compute-0 nova_compute[186329]: 2025-12-05 06:11:59.376 186333 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:11:59 compute-0 podman[196599]: time="2025-12-05T06:11:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:11:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:11:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:11:59 compute-0 nova_compute[186329]: 2025-12-05 06:11:59.750 186333 DEBUG nova.compute.manager [req-07225974-29c5-402e-981a-83f27b51cadc req-a10818db-e26e-4015-999c-b36c8228b319 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Received event network-vif-unplugged-baa5cca1-f04c-460d-b613-48c1aef4ec5b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:11:59 compute-0 nova_compute[186329]: 2025-12-05 06:11:59.751 186333 DEBUG oslo_concurrency.lockutils [req-07225974-29c5-402e-981a-83f27b51cadc req-a10818db-e26e-4015-999c-b36c8228b319 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "94466816-dfc5-4455-9992-20e7b47fddc3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:11:59 compute-0 nova_compute[186329]: 2025-12-05 06:11:59.751 186333 DEBUG oslo_concurrency.lockutils [req-07225974-29c5-402e-981a-83f27b51cadc req-a10818db-e26e-4015-999c-b36c8228b319 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "94466816-dfc5-4455-9992-20e7b47fddc3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:11:59 compute-0 nova_compute[186329]: 2025-12-05 06:11:59.751 186333 DEBUG oslo_concurrency.lockutils [req-07225974-29c5-402e-981a-83f27b51cadc req-a10818db-e26e-4015-999c-b36c8228b319 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "94466816-dfc5-4455-9992-20e7b47fddc3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:11:59 compute-0 nova_compute[186329]: 2025-12-05 06:11:59.751 186333 DEBUG nova.compute.manager [req-07225974-29c5-402e-981a-83f27b51cadc req-a10818db-e26e-4015-999c-b36c8228b319 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] No waiting events found dispatching network-vif-unplugged-baa5cca1-f04c-460d-b613-48c1aef4ec5b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:11:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:11:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2574 "" "Go-http-client/1.1"
Dec 05 06:11:59 compute-0 nova_compute[186329]: 2025-12-05 06:11:59.751 186333 DEBUG nova.compute.manager [req-07225974-29c5-402e-981a-83f27b51cadc req-a10818db-e26e-4015-999c-b36c8228b319 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Received event network-vif-unplugged-baa5cca1-f04c-460d-b613-48c1aef4ec5b for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 05 06:11:59 compute-0 nova_compute[186329]: 2025-12-05 06:11:59.753 186333 DEBUG nova.compute.manager [req-07225974-29c5-402e-981a-83f27b51cadc req-a10818db-e26e-4015-999c-b36c8228b319 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Received event network-vif-deleted-baa5cca1-f04c-460d-b613-48c1aef4ec5b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:11:59 compute-0 nova_compute[186329]: 2025-12-05 06:11:59.753 186333 INFO nova.compute.manager [req-07225974-29c5-402e-981a-83f27b51cadc req-a10818db-e26e-4015-999c-b36c8228b319 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Neutron deleted interface baa5cca1-f04c-460d-b613-48c1aef4ec5b; detaching it from the instance and deleting it from the info cache
Dec 05 06:11:59 compute-0 nova_compute[186329]: 2025-12-05 06:11:59.753 186333 DEBUG nova.network.neutron [req-07225974-29c5-402e-981a-83f27b51cadc req-a10818db-e26e-4015-999c-b36c8228b319 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:12:00 compute-0 nova_compute[186329]: 2025-12-05 06:12:00.067 186333 DEBUG nova.network.neutron [-] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:12:00 compute-0 nova_compute[186329]: 2025-12-05 06:12:00.258 186333 DEBUG nova.compute.manager [req-07225974-29c5-402e-981a-83f27b51cadc req-a10818db-e26e-4015-999c-b36c8228b319 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Detach interface failed, port_id=baa5cca1-f04c-460d-b613-48c1aef4ec5b, reason: Instance 94466816-dfc5-4455-9992-20e7b47fddc3 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Dec 05 06:12:00 compute-0 nova_compute[186329]: 2025-12-05 06:12:00.573 186333 INFO nova.compute.manager [-] [instance: 94466816-dfc5-4455-9992-20e7b47fddc3] Took 1.74 seconds to deallocate network for instance.
Dec 05 06:12:01 compute-0 nova_compute[186329]: 2025-12-05 06:12:01.085 186333 DEBUG oslo_concurrency.lockutils [None req-f02da893-86f8-41a8-bc18-5aee6e7493e6 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:12:01 compute-0 nova_compute[186329]: 2025-12-05 06:12:01.085 186333 DEBUG oslo_concurrency.lockutils [None req-f02da893-86f8-41a8-bc18-5aee6e7493e6 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:12:01 compute-0 nova_compute[186329]: 2025-12-05 06:12:01.125 186333 DEBUG nova.compute.provider_tree [None req-f02da893-86f8-41a8-bc18-5aee6e7493e6 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Updating inventory in ProviderTree for provider f2df025e-56e9-4920-9fad-1a12202c4aeb with inventory: {'MEMORY_MB': {'total': 7681, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 4, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 05 06:12:01 compute-0 openstack_network_exporter[198686]: ERROR   06:12:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:12:01 compute-0 openstack_network_exporter[198686]: ERROR   06:12:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:12:01 compute-0 openstack_network_exporter[198686]: ERROR   06:12:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:12:01 compute-0 openstack_network_exporter[198686]: ERROR   06:12:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:12:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:12:01 compute-0 openstack_network_exporter[198686]: ERROR   06:12:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:12:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:12:01 compute-0 nova_compute[186329]: 2025-12-05 06:12:01.643 186333 ERROR nova.scheduler.client.report [None req-f02da893-86f8-41a8-bc18-5aee6e7493e6 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] [req-e44d263d-7539-4f51-b08e-4e80369deba9] Failed to update inventory to [{'MEMORY_MB': {'total': 7681, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 4, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID f2df025e-56e9-4920-9fad-1a12202c4aeb.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-e44d263d-7539-4f51-b08e-4e80369deba9"}]}
Dec 05 06:12:01 compute-0 nova_compute[186329]: 2025-12-05 06:12:01.657 186333 DEBUG nova.scheduler.client.report [None req-f02da893-86f8-41a8-bc18-5aee6e7493e6 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Refreshing inventories for resource provider f2df025e-56e9-4920-9fad-1a12202c4aeb _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Dec 05 06:12:01 compute-0 nova_compute[186329]: 2025-12-05 06:12:01.667 186333 DEBUG nova.scheduler.client.report [None req-f02da893-86f8-41a8-bc18-5aee6e7493e6 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Updating ProviderTree inventory for provider f2df025e-56e9-4920-9fad-1a12202c4aeb from _refresh_and_get_inventory using data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Dec 05 06:12:01 compute-0 nova_compute[186329]: 2025-12-05 06:12:01.667 186333 DEBUG nova.compute.provider_tree [None req-f02da893-86f8-41a8-bc18-5aee6e7493e6 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Updating inventory in ProviderTree for provider f2df025e-56e9-4920-9fad-1a12202c4aeb with inventory: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 05 06:12:01 compute-0 nova_compute[186329]: 2025-12-05 06:12:01.678 186333 DEBUG nova.scheduler.client.report [None req-f02da893-86f8-41a8-bc18-5aee6e7493e6 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Refreshing aggregate associations for resource provider f2df025e-56e9-4920-9fad-1a12202c4aeb, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Dec 05 06:12:01 compute-0 nova_compute[186329]: 2025-12-05 06:12:01.691 186333 DEBUG nova.scheduler.client.report [None req-f02da893-86f8-41a8-bc18-5aee6e7493e6 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Refreshing trait associations for resource provider f2df025e-56e9-4920-9fad-1a12202c4aeb, traits: COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_EXTEND,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SOUND_MODEL_SB16,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE42,COMPUTE_NET_VIRTIO_PACKED,HW_CPU_X86_SSE2,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_TPM_CRB,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_TPM_TIS,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SOUND_MODEL_ES1370,HW_ARCH_X86_64,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SOUND_MODEL_USB,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_TRUSTED_CERTS,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_ARCH_X86_64,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_INTEL _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Dec 05 06:12:01 compute-0 nova_compute[186329]: 2025-12-05 06:12:01.718 186333 DEBUG nova.compute.provider_tree [None req-f02da893-86f8-41a8-bc18-5aee6e7493e6 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Updating inventory in ProviderTree for provider f2df025e-56e9-4920-9fad-1a12202c4aeb with inventory: {'MEMORY_MB': {'total': 7681, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 4, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 05 06:12:02 compute-0 nova_compute[186329]: 2025-12-05 06:12:02.246 186333 DEBUG nova.scheduler.client.report [None req-f02da893-86f8-41a8-bc18-5aee6e7493e6 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Updated inventory for provider f2df025e-56e9-4920-9fad-1a12202c4aeb with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7681, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 4, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:975
Dec 05 06:12:02 compute-0 nova_compute[186329]: 2025-12-05 06:12:02.247 186333 DEBUG nova.compute.provider_tree [None req-f02da893-86f8-41a8-bc18-5aee6e7493e6 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Updating resource provider f2df025e-56e9-4920-9fad-1a12202c4aeb generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Dec 05 06:12:02 compute-0 nova_compute[186329]: 2025-12-05 06:12:02.247 186333 DEBUG nova.compute.provider_tree [None req-f02da893-86f8-41a8-bc18-5aee6e7493e6 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Updating inventory in ProviderTree for provider f2df025e-56e9-4920-9fad-1a12202c4aeb with inventory: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 05 06:12:02 compute-0 nova_compute[186329]: 2025-12-05 06:12:02.752 186333 DEBUG oslo_concurrency.lockutils [None req-f02da893-86f8-41a8-bc18-5aee6e7493e6 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.666s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:12:02 compute-0 nova_compute[186329]: 2025-12-05 06:12:02.776 186333 INFO nova.scheduler.client.report [None req-f02da893-86f8-41a8-bc18-5aee6e7493e6 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Deleted allocations for instance 94466816-dfc5-4455-9992-20e7b47fddc3
Dec 05 06:12:03 compute-0 nova_compute[186329]: 2025-12-05 06:12:03.324 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:12:03 compute-0 nova_compute[186329]: 2025-12-05 06:12:03.567 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:12:03 compute-0 nova_compute[186329]: 2025-12-05 06:12:03.792 186333 DEBUG oslo_concurrency.lockutils [None req-f02da893-86f8-41a8-bc18-5aee6e7493e6 2f839a901b074095a8ac81f9095a0a01 7eb763694d8e480e9bb11451a932988d - - default default] Lock "94466816-dfc5-4455-9992-20e7b47fddc3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.735s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:12:06 compute-0 podman[207527]: 2025-12-05 06:12:06.460464824 +0000 UTC m=+0.044679391 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 06:12:06 compute-0 podman[207526]: 2025-12-05 06:12:06.480655062 +0000 UTC m=+0.065710299 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 05 06:12:08 compute-0 nova_compute[186329]: 2025-12-05 06:12:08.326 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:12:08 compute-0 nova_compute[186329]: 2025-12-05 06:12:08.567 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:12:13 compute-0 nova_compute[186329]: 2025-12-05 06:12:13.328 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:12:13 compute-0 nova_compute[186329]: 2025-12-05 06:12:13.568 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:12:14 compute-0 podman[207573]: 2025-12-05 06:12:14.460358302 +0000 UTC m=+0.043849470 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, config_id=edpm, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, distribution-scope=public)
Dec 05 06:12:14 compute-0 podman[207574]: 2025-12-05 06:12:14.461524375 +0000 UTC m=+0.043943768 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_id=multipathd, io.buildah.version=1.41.4, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 05 06:12:14 compute-0 podman[207572]: 2025-12-05 06:12:14.483392939 +0000 UTC m=+0.069321138 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 05 06:12:16 compute-0 nova_compute[186329]: 2025-12-05 06:12:16.514 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:12:18 compute-0 nova_compute[186329]: 2025-12-05 06:12:18.328 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:12:18 compute-0 nova_compute[186329]: 2025-12-05 06:12:18.569 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:12:23 compute-0 nova_compute[186329]: 2025-12-05 06:12:23.330 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:12:23 compute-0 nova_compute[186329]: 2025-12-05 06:12:23.570 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:12:28 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:12:28.193 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6b:b0:d1 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a2dd081e-dc20-4351-ac59-ccdb3568905c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd726452471114afe8e8cd3a437713a5d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=78329d2c-9d75-49ec-bc81-89ddda82582c, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=476e02c7-6c32-4db2-b808-685677ca76e8) old=Port_Binding(mac=['fa:16:3e:6b:b0:d1'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a2dd081e-dc20-4351-ac59-ccdb3568905c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd726452471114afe8e8cd3a437713a5d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:12:28 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:12:28.193 104041 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 476e02c7-6c32-4db2-b808-685677ca76e8 in datapath a2dd081e-dc20-4351-ac59-ccdb3568905c updated
Dec 05 06:12:28 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:12:28.194 104041 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a2dd081e-dc20-4351-ac59-ccdb3568905c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 05 06:12:28 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:12:28.194 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[2a0513d2-7e63-4a81-9edb-582d71986d4e]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:12:28 compute-0 nova_compute[186329]: 2025-12-05 06:12:28.332 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:12:28 compute-0 nova_compute[186329]: 2025-12-05 06:12:28.571 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:12:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:12:29.491 104041 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:12:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:12:29.492 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:12:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:12:29.492 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:12:29 compute-0 podman[196599]: time="2025-12-05T06:12:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:12:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:12:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:12:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:12:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2571 "" "Go-http-client/1.1"
Dec 05 06:12:31 compute-0 openstack_network_exporter[198686]: ERROR   06:12:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:12:31 compute-0 openstack_network_exporter[198686]: ERROR   06:12:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:12:31 compute-0 openstack_network_exporter[198686]: ERROR   06:12:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:12:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:12:31 compute-0 openstack_network_exporter[198686]: ERROR   06:12:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:12:31 compute-0 openstack_network_exporter[198686]: ERROR   06:12:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:12:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:12:33 compute-0 nova_compute[186329]: 2025-12-05 06:12:33.333 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:12:33 compute-0 nova_compute[186329]: 2025-12-05 06:12:33.572 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:12:35 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:12:35.318 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:dc:c7 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-39d1d053-fcf7-4247-9cea-55a413aabeaa', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-39d1d053-fcf7-4247-9cea-55a413aabeaa', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e3ec7864ec74a8e9a98ea7d30769fb0', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=87371ff9-828c-4251-a063-1ca6bf7708c3, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=dcc3afce-d506-4f03-8b4b-b2641768dc2c) old=Port_Binding(mac=['fa:16:3e:ff:dc:c7'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-39d1d053-fcf7-4247-9cea-55a413aabeaa', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-39d1d053-fcf7-4247-9cea-55a413aabeaa', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e3ec7864ec74a8e9a98ea7d30769fb0', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:12:35 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:12:35.318 104041 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port dcc3afce-d506-4f03-8b4b-b2641768dc2c in datapath 39d1d053-fcf7-4247-9cea-55a413aabeaa updated
Dec 05 06:12:35 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:12:35.319 104041 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 39d1d053-fcf7-4247-9cea-55a413aabeaa, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 05 06:12:35 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:12:35.319 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[ccaae724-ac53-4640-9f91-912b1b1150ad]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:12:37 compute-0 podman[207624]: 2025-12-05 06:12:37.446888756 +0000 UTC m=+0.030443514 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 06:12:37 compute-0 podman[207623]: 2025-12-05 06:12:37.473580385 +0000 UTC m=+0.058665640 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 05 06:12:38 compute-0 nova_compute[186329]: 2025-12-05 06:12:38.335 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:12:38 compute-0 nova_compute[186329]: 2025-12-05 06:12:38.574 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:12:40 compute-0 nova_compute[186329]: 2025-12-05 06:12:40.705 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:12:41 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:12:41.444 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '46:84:d1', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'a6:99:88:db:d9:a2'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:12:41 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:12:41.445 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 05 06:12:41 compute-0 nova_compute[186329]: 2025-12-05 06:12:41.445 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:12:41 compute-0 nova_compute[186329]: 2025-12-05 06:12:41.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:12:42 compute-0 nova_compute[186329]: 2025-12-05 06:12:42.225 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:12:42 compute-0 nova_compute[186329]: 2025-12-05 06:12:42.225 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:12:42 compute-0 nova_compute[186329]: 2025-12-05 06:12:42.225 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:12:42 compute-0 nova_compute[186329]: 2025-12-05 06:12:42.225 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 05 06:12:42 compute-0 nova_compute[186329]: 2025-12-05 06:12:42.403 186333 WARNING nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:12:42 compute-0 nova_compute[186329]: 2025-12-05 06:12:42.404 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:12:42 compute-0 nova_compute[186329]: 2025-12-05 06:12:42.416 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.012s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:12:42 compute-0 nova_compute[186329]: 2025-12-05 06:12:42.416 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5897MB free_disk=73.17062759399414GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": 
"0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 05 06:12:42 compute-0 nova_compute[186329]: 2025-12-05 06:12:42.417 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:12:42 compute-0 nova_compute[186329]: 2025-12-05 06:12:42.417 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:12:43 compute-0 nova_compute[186329]: 2025-12-05 06:12:43.337 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:12:43 compute-0 nova_compute[186329]: 2025-12-05 06:12:43.451 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 05 06:12:43 compute-0 nova_compute[186329]: 2025-12-05 06:12:43.451 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 06:12:42 up 50 min,  0 user,  load average: 0.13, 0.26, 0.36\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 05 06:12:43 compute-0 nova_compute[186329]: 2025-12-05 06:12:43.465 186333 DEBUG nova.compute.provider_tree [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:12:43 compute-0 nova_compute[186329]: 2025-12-05 06:12:43.573 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:12:43 compute-0 nova_compute[186329]: 2025-12-05 06:12:43.969 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:12:44 compute-0 nova_compute[186329]: 2025-12-05 06:12:44.475 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 05 06:12:44 compute-0 nova_compute[186329]: 2025-12-05 06:12:44.475 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.059s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:12:45 compute-0 podman[207673]: 2025-12-05 06:12:45.466416714 +0000 UTC m=+0.047518639 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true)
Dec 05 06:12:45 compute-0 podman[207672]: 2025-12-05 06:12:45.466514849 +0000 UTC m=+0.050658353 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, architecture=x86_64)
Dec 05 06:12:45 compute-0 nova_compute[186329]: 2025-12-05 06:12:45.470 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:12:45 compute-0 nova_compute[186329]: 2025-12-05 06:12:45.471 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:12:45 compute-0 nova_compute[186329]: 2025-12-05 06:12:45.471 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:12:45 compute-0 nova_compute[186329]: 2025-12-05 06:12:45.471 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:12:45 compute-0 nova_compute[186329]: 2025-12-05 06:12:45.471 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:12:45 compute-0 nova_compute[186329]: 2025-12-05 06:12:45.471 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:12:45 compute-0 nova_compute[186329]: 2025-12-05 06:12:45.471 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 05 06:12:45 compute-0 podman[207671]: 2025-12-05 06:12:45.48745358 +0000 UTC m=+0.073230687 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 05 06:12:46 compute-0 nova_compute[186329]: 2025-12-05 06:12:46.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:12:48 compute-0 nova_compute[186329]: 2025-12-05 06:12:48.339 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:12:48 compute-0 nova_compute[186329]: 2025-12-05 06:12:48.575 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:12:50 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:12:50.446 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=89d40815-76f5-4f1d-9077-84d831b7d6c4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:12:52 compute-0 ovn_controller[95223]: 2025-12-05T06:12:52Z|00048|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Dec 05 06:12:53 compute-0 nova_compute[186329]: 2025-12-05 06:12:53.340 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:12:53 compute-0 nova_compute[186329]: 2025-12-05 06:12:53.575 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:12:58 compute-0 nova_compute[186329]: 2025-12-05 06:12:58.341 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:12:58 compute-0 nova_compute[186329]: 2025-12-05 06:12:58.577 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:12:59 compute-0 podman[196599]: time="2025-12-05T06:12:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:12:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:12:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:12:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:12:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2569 "" "Go-http-client/1.1"
Dec 05 06:13:01 compute-0 openstack_network_exporter[198686]: ERROR   06:13:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:13:01 compute-0 openstack_network_exporter[198686]: ERROR   06:13:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:13:01 compute-0 openstack_network_exporter[198686]: ERROR   06:13:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:13:01 compute-0 openstack_network_exporter[198686]: ERROR   06:13:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:13:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:13:01 compute-0 openstack_network_exporter[198686]: ERROR   06:13:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:13:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:13:03 compute-0 nova_compute[186329]: 2025-12-05 06:13:03.342 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:03 compute-0 nova_compute[186329]: 2025-12-05 06:13:03.577 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:08 compute-0 nova_compute[186329]: 2025-12-05 06:13:08.344 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:08 compute-0 podman[207725]: 2025-12-05 06:13:08.413556469 +0000 UTC m=+0.043022286 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 06:13:08 compute-0 podman[207724]: 2025-12-05 06:13:08.43720802 +0000 UTC m=+0.068139913 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_id=ovn_controller)
Dec 05 06:13:08 compute-0 nova_compute[186329]: 2025-12-05 06:13:08.579 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:13 compute-0 nova_compute[186329]: 2025-12-05 06:13:13.348 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:13 compute-0 nova_compute[186329]: 2025-12-05 06:13:13.581 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:16 compute-0 nova_compute[186329]: 2025-12-05 06:13:16.338 186333 DEBUG oslo_concurrency.lockutils [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Acquiring lock "496a64d6-66b3-43ee-9e98-466ec3fd223d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:13:16 compute-0 nova_compute[186329]: 2025-12-05 06:13:16.339 186333 DEBUG oslo_concurrency.lockutils [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Lock "496a64d6-66b3-43ee-9e98-466ec3fd223d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:13:16 compute-0 podman[207769]: 2025-12-05 06:13:16.483933528 +0000 UTC m=+0.069788705 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, 
tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 05 06:13:16 compute-0 podman[207770]: 2025-12-05 06:13:16.490767558 +0000 UTC m=+0.075055137 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, architecture=x86_64, config_id=edpm, distribution-scope=public, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, version=9.6, release=1755695350)
Dec 05 06:13:16 compute-0 podman[207771]: 2025-12-05 06:13:16.500520836 +0000 UTC m=+0.072357753 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec 05 06:13:16 compute-0 nova_compute[186329]: 2025-12-05 06:13:16.842 186333 DEBUG nova.compute.manager [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Dec 05 06:13:17 compute-0 nova_compute[186329]: 2025-12-05 06:13:17.474 186333 DEBUG oslo_concurrency.lockutils [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:13:17 compute-0 nova_compute[186329]: 2025-12-05 06:13:17.475 186333 DEBUG oslo_concurrency.lockutils [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:13:17 compute-0 nova_compute[186329]: 2025-12-05 06:13:17.480 186333 DEBUG nova.virt.hardware [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 05 06:13:17 compute-0 nova_compute[186329]: 2025-12-05 06:13:17.480 186333 INFO nova.compute.claims [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Claim successful on node compute-0.ctlplane.example.com
Dec 05 06:13:18 compute-0 nova_compute[186329]: 2025-12-05 06:13:18.351 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:18 compute-0 nova_compute[186329]: 2025-12-05 06:13:18.514 186333 DEBUG nova.compute.provider_tree [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:13:18 compute-0 nova_compute[186329]: 2025-12-05 06:13:18.583 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:19 compute-0 nova_compute[186329]: 2025-12-05 06:13:19.019 186333 DEBUG nova.scheduler.client.report [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:13:19 compute-0 nova_compute[186329]: 2025-12-05 06:13:19.525 186333 DEBUG oslo_concurrency.lockutils [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.050s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:13:19 compute-0 nova_compute[186329]: 2025-12-05 06:13:19.527 186333 DEBUG nova.compute.manager [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Dec 05 06:13:20 compute-0 nova_compute[186329]: 2025-12-05 06:13:20.037 186333 DEBUG nova.compute.manager [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Dec 05 06:13:20 compute-0 nova_compute[186329]: 2025-12-05 06:13:20.037 186333 DEBUG nova.network.neutron [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Dec 05 06:13:20 compute-0 nova_compute[186329]: 2025-12-05 06:13:20.039 186333 WARNING neutronclient.v2_0.client [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:13:20 compute-0 nova_compute[186329]: 2025-12-05 06:13:20.039 186333 WARNING neutronclient.v2_0.client [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:13:20 compute-0 nova_compute[186329]: 2025-12-05 06:13:20.458 186333 DEBUG nova.network.neutron [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Successfully created port: 7871a5b7-b713-4fab-810f-37a03f953665 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Dec 05 06:13:20 compute-0 nova_compute[186329]: 2025-12-05 06:13:20.544 186333 INFO nova.virt.libvirt.driver [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 06:13:21 compute-0 nova_compute[186329]: 2025-12-05 06:13:21.050 186333 DEBUG nova.compute.manager [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Dec 05 06:13:22 compute-0 nova_compute[186329]: 2025-12-05 06:13:22.065 186333 DEBUG nova.compute.manager [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Dec 05 06:13:22 compute-0 nova_compute[186329]: 2025-12-05 06:13:22.068 186333 DEBUG nova.virt.libvirt.driver [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Dec 05 06:13:22 compute-0 nova_compute[186329]: 2025-12-05 06:13:22.068 186333 INFO nova.virt.libvirt.driver [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Creating image(s)
Dec 05 06:13:22 compute-0 nova_compute[186329]: 2025-12-05 06:13:22.069 186333 DEBUG oslo_concurrency.lockutils [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Acquiring lock "/var/lib/nova/instances/496a64d6-66b3-43ee-9e98-466ec3fd223d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:13:22 compute-0 nova_compute[186329]: 2025-12-05 06:13:22.069 186333 DEBUG oslo_concurrency.lockutils [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Lock "/var/lib/nova/instances/496a64d6-66b3-43ee-9e98-466ec3fd223d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:13:22 compute-0 nova_compute[186329]: 2025-12-05 06:13:22.070 186333 DEBUG oslo_concurrency.lockutils [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Lock "/var/lib/nova/instances/496a64d6-66b3-43ee-9e98-466ec3fd223d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:13:22 compute-0 nova_compute[186329]: 2025-12-05 06:13:22.071 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:13:22 compute-0 nova_compute[186329]: 2025-12-05 06:13:22.074 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:13:22 compute-0 nova_compute[186329]: 2025-12-05 06:13:22.078 186333 DEBUG oslo_concurrency.processutils [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:13:22 compute-0 nova_compute[186329]: 2025-12-05 06:13:22.126 186333 DEBUG oslo_concurrency.processutils [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:13:22 compute-0 nova_compute[186329]: 2025-12-05 06:13:22.126 186333 DEBUG oslo_concurrency.lockutils [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Acquiring lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:13:22 compute-0 nova_compute[186329]: 2025-12-05 06:13:22.127 186333 DEBUG oslo_concurrency.lockutils [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:13:22 compute-0 nova_compute[186329]: 2025-12-05 06:13:22.127 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:13:22 compute-0 nova_compute[186329]: 2025-12-05 06:13:22.129 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:13:22 compute-0 nova_compute[186329]: 2025-12-05 06:13:22.130 186333 DEBUG oslo_concurrency.processutils [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:13:22 compute-0 nova_compute[186329]: 2025-12-05 06:13:22.173 186333 DEBUG oslo_concurrency.processutils [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:13:22 compute-0 nova_compute[186329]: 2025-12-05 06:13:22.174 186333 DEBUG oslo_concurrency.processutils [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b,backing_fmt=raw /var/lib/nova/instances/496a64d6-66b3-43ee-9e98-466ec3fd223d/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:13:22 compute-0 nova_compute[186329]: 2025-12-05 06:13:22.196 186333 DEBUG oslo_concurrency.processutils [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b,backing_fmt=raw /var/lib/nova/instances/496a64d6-66b3-43ee-9e98-466ec3fd223d/disk 1073741824" returned: 0 in 0.022s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:13:22 compute-0 nova_compute[186329]: 2025-12-05 06:13:22.196 186333 DEBUG oslo_concurrency.lockutils [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.069s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:13:22 compute-0 nova_compute[186329]: 2025-12-05 06:13:22.196 186333 DEBUG oslo_concurrency.processutils [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:13:22 compute-0 nova_compute[186329]: 2025-12-05 06:13:22.241 186333 DEBUG oslo_concurrency.processutils [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:13:22 compute-0 nova_compute[186329]: 2025-12-05 06:13:22.241 186333 DEBUG nova.virt.disk.api [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Checking if we can resize image /var/lib/nova/instances/496a64d6-66b3-43ee-9e98-466ec3fd223d/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 05 06:13:22 compute-0 nova_compute[186329]: 2025-12-05 06:13:22.242 186333 DEBUG oslo_concurrency.processutils [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/496a64d6-66b3-43ee-9e98-466ec3fd223d/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:13:22 compute-0 nova_compute[186329]: 2025-12-05 06:13:22.292 186333 DEBUG oslo_concurrency.processutils [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/496a64d6-66b3-43ee-9e98-466ec3fd223d/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:13:22 compute-0 nova_compute[186329]: 2025-12-05 06:13:22.294 186333 DEBUG nova.virt.disk.api [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Cannot resize image /var/lib/nova/instances/496a64d6-66b3-43ee-9e98-466ec3fd223d/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 05 06:13:22 compute-0 nova_compute[186329]: 2025-12-05 06:13:22.295 186333 DEBUG nova.virt.libvirt.driver [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Dec 05 06:13:22 compute-0 nova_compute[186329]: 2025-12-05 06:13:22.295 186333 DEBUG nova.virt.libvirt.driver [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Ensure instance console log exists: /var/lib/nova/instances/496a64d6-66b3-43ee-9e98-466ec3fd223d/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 05 06:13:22 compute-0 nova_compute[186329]: 2025-12-05 06:13:22.295 186333 DEBUG oslo_concurrency.lockutils [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:13:22 compute-0 nova_compute[186329]: 2025-12-05 06:13:22.296 186333 DEBUG oslo_concurrency.lockutils [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:13:22 compute-0 nova_compute[186329]: 2025-12-05 06:13:22.297 186333 DEBUG oslo_concurrency.lockutils [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:13:23 compute-0 nova_compute[186329]: 2025-12-05 06:13:23.353 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:23 compute-0 nova_compute[186329]: 2025-12-05 06:13:23.585 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:24 compute-0 nova_compute[186329]: 2025-12-05 06:13:24.450 186333 DEBUG nova.network.neutron [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Successfully updated port: 7871a5b7-b713-4fab-810f-37a03f953665 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Dec 05 06:13:24 compute-0 nova_compute[186329]: 2025-12-05 06:13:24.486 186333 DEBUG nova.compute.manager [req-e7e239c3-6778-4352-b6d4-f3c475de1d35 req-09563fa9-0319-4f8c-ae06-ce0f34d3eeab fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Received event network-changed-7871a5b7-b713-4fab-810f-37a03f953665 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:13:24 compute-0 nova_compute[186329]: 2025-12-05 06:13:24.487 186333 DEBUG nova.compute.manager [req-e7e239c3-6778-4352-b6d4-f3c475de1d35 req-09563fa9-0319-4f8c-ae06-ce0f34d3eeab fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Refreshing instance network info cache due to event network-changed-7871a5b7-b713-4fab-810f-37a03f953665. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 05 06:13:24 compute-0 nova_compute[186329]: 2025-12-05 06:13:24.487 186333 DEBUG oslo_concurrency.lockutils [req-e7e239c3-6778-4352-b6d4-f3c475de1d35 req-09563fa9-0319-4f8c-ae06-ce0f34d3eeab fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "refresh_cache-496a64d6-66b3-43ee-9e98-466ec3fd223d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:13:24 compute-0 nova_compute[186329]: 2025-12-05 06:13:24.487 186333 DEBUG oslo_concurrency.lockutils [req-e7e239c3-6778-4352-b6d4-f3c475de1d35 req-09563fa9-0319-4f8c-ae06-ce0f34d3eeab fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquired lock "refresh_cache-496a64d6-66b3-43ee-9e98-466ec3fd223d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:13:24 compute-0 nova_compute[186329]: 2025-12-05 06:13:24.487 186333 DEBUG nova.network.neutron [req-e7e239c3-6778-4352-b6d4-f3c475de1d35 req-09563fa9-0319-4f8c-ae06-ce0f34d3eeab fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Refreshing network info cache for port 7871a5b7-b713-4fab-810f-37a03f953665 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 05 06:13:24 compute-0 nova_compute[186329]: 2025-12-05 06:13:24.963 186333 DEBUG oslo_concurrency.lockutils [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Acquiring lock "refresh_cache-496a64d6-66b3-43ee-9e98-466ec3fd223d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:13:24 compute-0 nova_compute[186329]: 2025-12-05 06:13:24.990 186333 WARNING neutronclient.v2_0.client [req-e7e239c3-6778-4352-b6d4-f3c475de1d35 req-09563fa9-0319-4f8c-ae06-ce0f34d3eeab fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:13:25 compute-0 nova_compute[186329]: 2025-12-05 06:13:25.437 186333 DEBUG nova.network.neutron [req-e7e239c3-6778-4352-b6d4-f3c475de1d35 req-09563fa9-0319-4f8c-ae06-ce0f34d3eeab fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 05 06:13:25 compute-0 nova_compute[186329]: 2025-12-05 06:13:25.543 186333 DEBUG nova.network.neutron [req-e7e239c3-6778-4352-b6d4-f3c475de1d35 req-09563fa9-0319-4f8c-ae06-ce0f34d3eeab fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:13:26 compute-0 nova_compute[186329]: 2025-12-05 06:13:26.050 186333 DEBUG oslo_concurrency.lockutils [req-e7e239c3-6778-4352-b6d4-f3c475de1d35 req-09563fa9-0319-4f8c-ae06-ce0f34d3eeab fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Releasing lock "refresh_cache-496a64d6-66b3-43ee-9e98-466ec3fd223d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:13:26 compute-0 nova_compute[186329]: 2025-12-05 06:13:26.051 186333 DEBUG oslo_concurrency.lockutils [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Acquired lock "refresh_cache-496a64d6-66b3-43ee-9e98-466ec3fd223d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:13:26 compute-0 nova_compute[186329]: 2025-12-05 06:13:26.051 186333 DEBUG nova.network.neutron [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 05 06:13:27 compute-0 nova_compute[186329]: 2025-12-05 06:13:27.437 186333 DEBUG nova.network.neutron [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 05 06:13:27 compute-0 nova_compute[186329]: 2025-12-05 06:13:27.604 186333 WARNING neutronclient.v2_0.client [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:13:28 compute-0 nova_compute[186329]: 2025-12-05 06:13:28.355 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:28 compute-0 nova_compute[186329]: 2025-12-05 06:13:28.519 186333 DEBUG nova.network.neutron [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Updating instance_info_cache with network_info: [{"id": "7871a5b7-b713-4fab-810f-37a03f953665", "address": "fa:16:3e:0a:bd:1d", "network": {"id": "a2dd081e-dc20-4351-ac59-ccdb3568905c", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1604933201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d726452471114afe8e8cd3a437713a5d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7871a5b7-b7", "ovs_interfaceid": "7871a5b7-b713-4fab-810f-37a03f953665", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:13:28 compute-0 nova_compute[186329]: 2025-12-05 06:13:28.587 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.023 186333 DEBUG oslo_concurrency.lockutils [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Releasing lock "refresh_cache-496a64d6-66b3-43ee-9e98-466ec3fd223d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.024 186333 DEBUG nova.compute.manager [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Instance network_info: |[{"id": "7871a5b7-b713-4fab-810f-37a03f953665", "address": "fa:16:3e:0a:bd:1d", "network": {"id": "a2dd081e-dc20-4351-ac59-ccdb3568905c", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1604933201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d726452471114afe8e8cd3a437713a5d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7871a5b7-b7", "ovs_interfaceid": "7871a5b7-b713-4fab-810f-37a03f953665", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.026 186333 DEBUG nova.virt.libvirt.driver [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Start _get_guest_xml network_info=[{"id": "7871a5b7-b713-4fab-810f-37a03f953665", "address": "fa:16:3e:0a:bd:1d", "network": {"id": "a2dd081e-dc20-4351-ac59-ccdb3568905c", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1604933201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d726452471114afe8e8cd3a437713a5d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7871a5b7-b7", "ovs_interfaceid": "7871a5b7-b713-4fab-810f-37a03f953665", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T06:07:47Z,direct_url=<?>,disk_format='qcow2',id=6903ca06-7f44-4ad2-ab8b-0d16feef7d51,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fcef582be2274b9ba43451b49b4066ec',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T06:07:49Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'device_name': '/dev/vda', 'guest_format': None, 'image_id': '6903ca06-7f44-4ad2-ab8b-0d16feef7d51'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.029 186333 WARNING nova.virt.libvirt.driver [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.031 186333 DEBUG nova.virt.driver [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='6903ca06-7f44-4ad2-ab8b-0d16feef7d51', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-875708825', uuid='496a64d6-66b3-43ee-9e98-466ec3fd223d'), owner=OwnerMeta(userid='e5df4001be694d0a80e0436f215d8a10', username='tempest-TestExecuteActionsViaActuator-2083413180-project-admin', projectid='5e3ec7864ec74a8e9a98ea7d30769fb0', projectname='tempest-TestExecuteActionsViaActuator-2083413180'), image=ImageMeta(id='6903ca06-7f44-4ad2-ab8b-0d16feef7d51', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='cb13e320-971c-46c2-a935-d695f3631bf8', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "7871a5b7-b713-4fab-810f-37a03f953665", "address": "fa:16:3e:0a:bd:1d", "network": {"id": "a2dd081e-dc20-4351-ac59-ccdb3568905c", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1604933201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d726452471114afe8e8cd3a437713a5d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7871a5b7-b7", "ovs_interfaceid": 
"7871a5b7-b713-4fab-810f-37a03f953665", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764915209.0309165) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.036 186333 DEBUG nova.virt.libvirt.host [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.036 186333 DEBUG nova.virt.libvirt.host [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.039 186333 DEBUG nova.virt.libvirt.host [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.039 186333 DEBUG nova.virt.libvirt.host [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.041 186333 DEBUG nova.virt.libvirt.driver [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.041 186333 DEBUG nova.virt.hardware [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T06:07:46Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cb13e320-971c-46c2-a935-d695f3631bf8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T06:07:47Z,direct_url=<?>,disk_format='qcow2',id=6903ca06-7f44-4ad2-ab8b-0d16feef7d51,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fcef582be2274b9ba43451b49b4066ec',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T06:07:49Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.041 186333 DEBUG nova.virt.hardware [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.041 186333 DEBUG nova.virt.hardware [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.041 186333 DEBUG nova.virt.hardware [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.042 186333 DEBUG nova.virt.hardware [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.042 186333 DEBUG nova.virt.hardware [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.042 186333 DEBUG nova.virt.hardware [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.042 186333 DEBUG nova.virt.hardware [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.042 186333 DEBUG nova.virt.hardware [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.042 186333 DEBUG nova.virt.hardware [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.043 186333 DEBUG nova.virt.hardware [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.046 186333 DEBUG nova.virt.libvirt.vif [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-05T06:13:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-875708825',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-875708825',id=5,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5e3ec7864ec74a8e9a98ea7d30769fb0',ramdisk_id='',reservation_id='r-7knweki8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,member,admin,reader',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-2083413180',owner_user_name='tempest-TestExecuteActionsV
iaActuator-2083413180-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T06:13:21Z,user_data=None,user_id='e5df4001be694d0a80e0436f215d8a10',uuid=496a64d6-66b3-43ee-9e98-466ec3fd223d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7871a5b7-b713-4fab-810f-37a03f953665", "address": "fa:16:3e:0a:bd:1d", "network": {"id": "a2dd081e-dc20-4351-ac59-ccdb3568905c", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1604933201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d726452471114afe8e8cd3a437713a5d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7871a5b7-b7", "ovs_interfaceid": "7871a5b7-b713-4fab-810f-37a03f953665", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.046 186333 DEBUG nova.network.os_vif_util [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Converting VIF {"id": "7871a5b7-b713-4fab-810f-37a03f953665", "address": "fa:16:3e:0a:bd:1d", "network": {"id": "a2dd081e-dc20-4351-ac59-ccdb3568905c", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1604933201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d726452471114afe8e8cd3a437713a5d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7871a5b7-b7", "ovs_interfaceid": "7871a5b7-b713-4fab-810f-37a03f953665", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.047 186333 DEBUG nova.network.os_vif_util [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0a:bd:1d,bridge_name='br-int',has_traffic_filtering=True,id=7871a5b7-b713-4fab-810f-37a03f953665,network=Network(a2dd081e-dc20-4351-ac59-ccdb3568905c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7871a5b7-b7') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.048 186333 DEBUG nova.objects.instance [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 496a64d6-66b3-43ee-9e98-466ec3fd223d obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:13:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:29.492 104041 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:13:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:29.493 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:13:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:29.493 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.555 186333 DEBUG nova.virt.libvirt.driver [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] End _get_guest_xml xml=<domain type="kvm">
Dec 05 06:13:29 compute-0 nova_compute[186329]:   <uuid>496a64d6-66b3-43ee-9e98-466ec3fd223d</uuid>
Dec 05 06:13:29 compute-0 nova_compute[186329]:   <name>instance-00000005</name>
Dec 05 06:13:29 compute-0 nova_compute[186329]:   <memory>131072</memory>
Dec 05 06:13:29 compute-0 nova_compute[186329]:   <vcpu>1</vcpu>
Dec 05 06:13:29 compute-0 nova_compute[186329]:   <metadata>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 06:13:29 compute-0 nova_compute[186329]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-875708825</nova:name>
Dec 05 06:13:29 compute-0 nova_compute[186329]:       <nova:creationTime>2025-12-05 06:13:29</nova:creationTime>
Dec 05 06:13:29 compute-0 nova_compute[186329]:       <nova:flavor name="m1.nano" id="cb13e320-971c-46c2-a935-d695f3631bf8">
Dec 05 06:13:29 compute-0 nova_compute[186329]:         <nova:memory>128</nova:memory>
Dec 05 06:13:29 compute-0 nova_compute[186329]:         <nova:disk>1</nova:disk>
Dec 05 06:13:29 compute-0 nova_compute[186329]:         <nova:swap>0</nova:swap>
Dec 05 06:13:29 compute-0 nova_compute[186329]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 06:13:29 compute-0 nova_compute[186329]:         <nova:vcpus>1</nova:vcpus>
Dec 05 06:13:29 compute-0 nova_compute[186329]:         <nova:extraSpecs>
Dec 05 06:13:29 compute-0 nova_compute[186329]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 05 06:13:29 compute-0 nova_compute[186329]:         </nova:extraSpecs>
Dec 05 06:13:29 compute-0 nova_compute[186329]:       </nova:flavor>
Dec 05 06:13:29 compute-0 nova_compute[186329]:       <nova:image uuid="6903ca06-7f44-4ad2-ab8b-0d16feef7d51">
Dec 05 06:13:29 compute-0 nova_compute[186329]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 05 06:13:29 compute-0 nova_compute[186329]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 05 06:13:29 compute-0 nova_compute[186329]:         <nova:minDisk>1</nova:minDisk>
Dec 05 06:13:29 compute-0 nova_compute[186329]:         <nova:minRam>0</nova:minRam>
Dec 05 06:13:29 compute-0 nova_compute[186329]:         <nova:properties>
Dec 05 06:13:29 compute-0 nova_compute[186329]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 05 06:13:29 compute-0 nova_compute[186329]:         </nova:properties>
Dec 05 06:13:29 compute-0 nova_compute[186329]:       </nova:image>
Dec 05 06:13:29 compute-0 nova_compute[186329]:       <nova:owner>
Dec 05 06:13:29 compute-0 nova_compute[186329]:         <nova:user uuid="e5df4001be694d0a80e0436f215d8a10">tempest-TestExecuteActionsViaActuator-2083413180-project-admin</nova:user>
Dec 05 06:13:29 compute-0 nova_compute[186329]:         <nova:project uuid="5e3ec7864ec74a8e9a98ea7d30769fb0">tempest-TestExecuteActionsViaActuator-2083413180</nova:project>
Dec 05 06:13:29 compute-0 nova_compute[186329]:       </nova:owner>
Dec 05 06:13:29 compute-0 nova_compute[186329]:       <nova:root type="image" uuid="6903ca06-7f44-4ad2-ab8b-0d16feef7d51"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:       <nova:ports>
Dec 05 06:13:29 compute-0 nova_compute[186329]:         <nova:port uuid="7871a5b7-b713-4fab-810f-37a03f953665">
Dec 05 06:13:29 compute-0 nova_compute[186329]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:         </nova:port>
Dec 05 06:13:29 compute-0 nova_compute[186329]:       </nova:ports>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     </nova:instance>
Dec 05 06:13:29 compute-0 nova_compute[186329]:   </metadata>
Dec 05 06:13:29 compute-0 nova_compute[186329]:   <sysinfo type="smbios">
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <system>
Dec 05 06:13:29 compute-0 nova_compute[186329]:       <entry name="manufacturer">RDO</entry>
Dec 05 06:13:29 compute-0 nova_compute[186329]:       <entry name="product">OpenStack Compute</entry>
Dec 05 06:13:29 compute-0 nova_compute[186329]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 05 06:13:29 compute-0 nova_compute[186329]:       <entry name="serial">496a64d6-66b3-43ee-9e98-466ec3fd223d</entry>
Dec 05 06:13:29 compute-0 nova_compute[186329]:       <entry name="uuid">496a64d6-66b3-43ee-9e98-466ec3fd223d</entry>
Dec 05 06:13:29 compute-0 nova_compute[186329]:       <entry name="family">Virtual Machine</entry>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     </system>
Dec 05 06:13:29 compute-0 nova_compute[186329]:   </sysinfo>
Dec 05 06:13:29 compute-0 nova_compute[186329]:   <os>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <boot dev="hd"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <smbios mode="sysinfo"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:   </os>
Dec 05 06:13:29 compute-0 nova_compute[186329]:   <features>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <acpi/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <apic/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <vmcoreinfo/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:   </features>
Dec 05 06:13:29 compute-0 nova_compute[186329]:   <clock offset="utc">
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <timer name="hpet" present="no"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:   </clock>
Dec 05 06:13:29 compute-0 nova_compute[186329]:   <cpu mode="custom" match="exact">
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <model>Nehalem</model>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:   </cpu>
Dec 05 06:13:29 compute-0 nova_compute[186329]:   <devices>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <disk type="file" device="disk">
Dec 05 06:13:29 compute-0 nova_compute[186329]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:       <source file="/var/lib/nova/instances/496a64d6-66b3-43ee-9e98-466ec3fd223d/disk"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:       <target dev="vda" bus="virtio"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     </disk>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <disk type="file" device="cdrom">
Dec 05 06:13:29 compute-0 nova_compute[186329]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:       <source file="/var/lib/nova/instances/496a64d6-66b3-43ee-9e98-466ec3fd223d/disk.config"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:       <target dev="sda" bus="sata"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     </disk>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <interface type="ethernet">
Dec 05 06:13:29 compute-0 nova_compute[186329]:       <mac address="fa:16:3e:0a:bd:1d"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:       <model type="virtio"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:       <mtu size="1442"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:       <target dev="tap7871a5b7-b7"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     </interface>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <serial type="pty">
Dec 05 06:13:29 compute-0 nova_compute[186329]:       <log file="/var/lib/nova/instances/496a64d6-66b3-43ee-9e98-466ec3fd223d/console.log" append="off"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     </serial>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <video>
Dec 05 06:13:29 compute-0 nova_compute[186329]:       <model type="virtio"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     </video>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <input type="tablet" bus="usb"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <rng model="virtio">
Dec 05 06:13:29 compute-0 nova_compute[186329]:       <backend model="random">/dev/urandom</backend>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     </rng>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <controller type="usb" index="0"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 05 06:13:29 compute-0 nova_compute[186329]:       <stats period="10"/>
Dec 05 06:13:29 compute-0 nova_compute[186329]:     </memballoon>
Dec 05 06:13:29 compute-0 nova_compute[186329]:   </devices>
Dec 05 06:13:29 compute-0 nova_compute[186329]: </domain>
Dec 05 06:13:29 compute-0 nova_compute[186329]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.558 186333 DEBUG nova.compute.manager [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Preparing to wait for external event network-vif-plugged-7871a5b7-b713-4fab-810f-37a03f953665 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.558 186333 DEBUG oslo_concurrency.lockutils [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Acquiring lock "496a64d6-66b3-43ee-9e98-466ec3fd223d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.559 186333 DEBUG oslo_concurrency.lockutils [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Lock "496a64d6-66b3-43ee-9e98-466ec3fd223d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.559 186333 DEBUG oslo_concurrency.lockutils [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Lock "496a64d6-66b3-43ee-9e98-466ec3fd223d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.559 186333 DEBUG nova.virt.libvirt.vif [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-05T06:13:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-875708825',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-875708825',id=5,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5e3ec7864ec74a8e9a98ea7d30769fb0',ramdisk_id='',reservation_id='r-7knweki8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,member,admin,reader',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-2083413180',owner_user_name='tempest-TestExecu
teActionsViaActuator-2083413180-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T06:13:21Z,user_data=None,user_id='e5df4001be694d0a80e0436f215d8a10',uuid=496a64d6-66b3-43ee-9e98-466ec3fd223d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7871a5b7-b713-4fab-810f-37a03f953665", "address": "fa:16:3e:0a:bd:1d", "network": {"id": "a2dd081e-dc20-4351-ac59-ccdb3568905c", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1604933201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d726452471114afe8e8cd3a437713a5d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7871a5b7-b7", "ovs_interfaceid": "7871a5b7-b713-4fab-810f-37a03f953665", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.560 186333 DEBUG nova.network.os_vif_util [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Converting VIF {"id": "7871a5b7-b713-4fab-810f-37a03f953665", "address": "fa:16:3e:0a:bd:1d", "network": {"id": "a2dd081e-dc20-4351-ac59-ccdb3568905c", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1604933201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d726452471114afe8e8cd3a437713a5d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7871a5b7-b7", "ovs_interfaceid": "7871a5b7-b713-4fab-810f-37a03f953665", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.560 186333 DEBUG nova.network.os_vif_util [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0a:bd:1d,bridge_name='br-int',has_traffic_filtering=True,id=7871a5b7-b713-4fab-810f-37a03f953665,network=Network(a2dd081e-dc20-4351-ac59-ccdb3568905c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7871a5b7-b7') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.560 186333 DEBUG os_vif [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0a:bd:1d,bridge_name='br-int',has_traffic_filtering=True,id=7871a5b7-b713-4fab-810f-37a03f953665,network=Network(a2dd081e-dc20-4351-ac59-ccdb3568905c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7871a5b7-b7') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.561 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.561 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.561 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.562 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.562 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '13c2af4b-483b-524f-a901-4f09fdd7f30d', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.563 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.565 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.566 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.566 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7871a5b7-b7, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.567 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap7871a5b7-b7, col_values=(('qos', UUID('81d6aafc-aaea-41c4-b2b9-fa68b6467dec')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.567 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap7871a5b7-b7, col_values=(('external_ids', {'iface-id': '7871a5b7-b713-4fab-810f-37a03f953665', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0a:bd:1d', 'vm-uuid': '496a64d6-66b3-43ee-9e98-466ec3fd223d'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.568 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.569 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:13:29 compute-0 NetworkManager[55434]: <info>  [1764915209.5722] manager: (tap7871a5b7-b7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/25)
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.573 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:29 compute-0 nova_compute[186329]: 2025-12-05 06:13:29.573 186333 INFO os_vif [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0a:bd:1d,bridge_name='br-int',has_traffic_filtering=True,id=7871a5b7-b713-4fab-810f-37a03f953665,network=Network(a2dd081e-dc20-4351-ac59-ccdb3568905c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7871a5b7-b7')
Dec 05 06:13:29 compute-0 podman[196599]: time="2025-12-05T06:13:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:13:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:13:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:13:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:13:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2575 "" "Go-http-client/1.1"
Dec 05 06:13:31 compute-0 nova_compute[186329]: 2025-12-05 06:13:31.099 186333 DEBUG nova.virt.libvirt.driver [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 05 06:13:31 compute-0 nova_compute[186329]: 2025-12-05 06:13:31.099 186333 DEBUG nova.virt.libvirt.driver [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 05 06:13:31 compute-0 nova_compute[186329]: 2025-12-05 06:13:31.100 186333 DEBUG nova.virt.libvirt.driver [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] No VIF found with MAC fa:16:3e:0a:bd:1d, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 05 06:13:31 compute-0 nova_compute[186329]: 2025-12-05 06:13:31.100 186333 INFO nova.virt.libvirt.driver [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Using config drive
Dec 05 06:13:31 compute-0 openstack_network_exporter[198686]: ERROR   06:13:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:13:31 compute-0 openstack_network_exporter[198686]: ERROR   06:13:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:13:31 compute-0 openstack_network_exporter[198686]: ERROR   06:13:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:13:31 compute-0 openstack_network_exporter[198686]: ERROR   06:13:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:13:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:13:31 compute-0 openstack_network_exporter[198686]: ERROR   06:13:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:13:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:13:31 compute-0 nova_compute[186329]: 2025-12-05 06:13:31.610 186333 WARNING neutronclient.v2_0.client [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:13:31 compute-0 nova_compute[186329]: 2025-12-05 06:13:31.766 186333 INFO nova.virt.libvirt.driver [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Creating config drive at /var/lib/nova/instances/496a64d6-66b3-43ee-9e98-466ec3fd223d/disk.config
Dec 05 06:13:31 compute-0 nova_compute[186329]: 2025-12-05 06:13:31.771 186333 DEBUG oslo_concurrency.processutils [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/496a64d6-66b3-43ee-9e98-466ec3fd223d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp5b4eblfz execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:13:31 compute-0 nova_compute[186329]: 2025-12-05 06:13:31.891 186333 DEBUG oslo_concurrency.processutils [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/496a64d6-66b3-43ee-9e98-466ec3fd223d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp5b4eblfz" returned: 0 in 0.120s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:13:31 compute-0 kernel: tap7871a5b7-b7: entered promiscuous mode
Dec 05 06:13:31 compute-0 NetworkManager[55434]: <info>  [1764915211.9369] manager: (tap7871a5b7-b7): new Tun device (/org/freedesktop/NetworkManager/Devices/26)
Dec 05 06:13:31 compute-0 nova_compute[186329]: 2025-12-05 06:13:31.938 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:31 compute-0 ovn_controller[95223]: 2025-12-05T06:13:31Z|00049|binding|INFO|Claiming lport 7871a5b7-b713-4fab-810f-37a03f953665 for this chassis.
Dec 05 06:13:31 compute-0 ovn_controller[95223]: 2025-12-05T06:13:31Z|00050|binding|INFO|7871a5b7-b713-4fab-810f-37a03f953665: Claiming fa:16:3e:0a:bd:1d 10.100.0.5
Dec 05 06:13:31 compute-0 nova_compute[186329]: 2025-12-05 06:13:31.944 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:31 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:31.949 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0a:bd:1d 10.100.0.5'], port_security=['fa:16:3e:0a:bd:1d 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '496a64d6-66b3-43ee-9e98-466ec3fd223d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a2dd081e-dc20-4351-ac59-ccdb3568905c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e3ec7864ec74a8e9a98ea7d30769fb0', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4eeb11fa-86ff-4478-bf3d-491b18f116e0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=78329d2c-9d75-49ec-bc81-89ddda82582c, chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], logical_port=7871a5b7-b713-4fab-810f-37a03f953665) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:13:31 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:31.950 104041 INFO neutron.agent.ovn.metadata.agent [-] Port 7871a5b7-b713-4fab-810f-37a03f953665 in datapath a2dd081e-dc20-4351-ac59-ccdb3568905c bound to our chassis
Dec 05 06:13:31 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:31.951 104041 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a2dd081e-dc20-4351-ac59-ccdb3568905c
Dec 05 06:13:31 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:31.960 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[36a10de4-07a2-4231-a802-9f3251497659]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:13:31 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:31.960 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa2dd081e-d1 in ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 05 06:13:31 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:31.966 206383 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa2dd081e-d0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 05 06:13:31 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:31.967 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[ceaf8c04-251a-47b0-85cc-b677da88b336]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:13:31 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:31.967 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[22ab7bd1-a8cc-42b4-85fd-0150f62588c4]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:13:31 compute-0 systemd-udevd[207856]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 06:13:31 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:31.976 104153 DEBUG oslo.privsep.daemon [-] privsep: reply[b97c5a56-1e02-4310-bb43-5b7103da9bdc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:13:31 compute-0 NetworkManager[55434]: <info>  [1764915211.9837] device (tap7871a5b7-b7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 06:13:31 compute-0 NetworkManager[55434]: <info>  [1764915211.9848] device (tap7871a5b7-b7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 06:13:31 compute-0 nova_compute[186329]: 2025-12-05 06:13:31.999 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:31.998 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[929f8ef3-ef0b-4d10-9739-3aca9bc9765e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:13:32 compute-0 ovn_controller[95223]: 2025-12-05T06:13:32Z|00051|binding|INFO|Setting lport 7871a5b7-b713-4fab-810f-37a03f953665 ovn-installed in OVS
Dec 05 06:13:32 compute-0 ovn_controller[95223]: 2025-12-05T06:13:32Z|00052|binding|INFO|Setting lport 7871a5b7-b713-4fab-810f-37a03f953665 up in Southbound
Dec 05 06:13:32 compute-0 nova_compute[186329]: 2025-12-05 06:13:32.012 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:32 compute-0 systemd-machined[152967]: New machine qemu-2-instance-00000005.
Dec 05 06:13:32 compute-0 systemd[1]: Started Virtual Machine qemu-2-instance-00000005.
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:32.034 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[5de7458a-4778-4d1b-86a7-e9320c16614a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:13:32 compute-0 NetworkManager[55434]: <info>  [1764915212.0389] manager: (tapa2dd081e-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/27)
Dec 05 06:13:32 compute-0 systemd-udevd[207862]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:32.037 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[6d5a8272-65fc-4a90-a45a-c4376a7b80e7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:32.070 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[83146e52-caee-42c6-9840-18b1d45059ab]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:32.072 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[9c798cd1-6251-476a-9b41-7d9b8f5071a1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:13:32 compute-0 NetworkManager[55434]: <info>  [1764915212.0906] device (tapa2dd081e-d0): carrier: link connected
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:32.097 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[bcaed6a3-6dd8-4aa0-8b9e-e47e0a855b26]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:32.111 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[cb9cd31c-32e3-4dcf-8227-fa12e666a26a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa2dd081e-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6b:b0:d1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 309062, 'reachable_time': 42628, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 207883, 'error': None, 'target': 'ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:32.125 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[60e7bd27-3ae3-4af5-81e6-d0183a4e9bd5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6b:b0d1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 309062, 'tstamp': 309062}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 207884, 'error': None, 'target': 'ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:13:32 compute-0 nova_compute[186329]: 2025-12-05 06:13:32.127 186333 DEBUG nova.compute.manager [req-8c0195e3-c8c1-4396-8c71-e97e9699b033 req-8e8183d3-52ac-429f-9370-9211a9026de6 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Received event network-vif-plugged-7871a5b7-b713-4fab-810f-37a03f953665 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:13:32 compute-0 nova_compute[186329]: 2025-12-05 06:13:32.128 186333 DEBUG oslo_concurrency.lockutils [req-8c0195e3-c8c1-4396-8c71-e97e9699b033 req-8e8183d3-52ac-429f-9370-9211a9026de6 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "496a64d6-66b3-43ee-9e98-466ec3fd223d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:13:32 compute-0 nova_compute[186329]: 2025-12-05 06:13:32.128 186333 DEBUG oslo_concurrency.lockutils [req-8c0195e3-c8c1-4396-8c71-e97e9699b033 req-8e8183d3-52ac-429f-9370-9211a9026de6 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "496a64d6-66b3-43ee-9e98-466ec3fd223d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:13:32 compute-0 nova_compute[186329]: 2025-12-05 06:13:32.128 186333 DEBUG oslo_concurrency.lockutils [req-8c0195e3-c8c1-4396-8c71-e97e9699b033 req-8e8183d3-52ac-429f-9370-9211a9026de6 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "496a64d6-66b3-43ee-9e98-466ec3fd223d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:13:32 compute-0 nova_compute[186329]: 2025-12-05 06:13:32.128 186333 DEBUG nova.compute.manager [req-8c0195e3-c8c1-4396-8c71-e97e9699b033 req-8e8183d3-52ac-429f-9370-9211a9026de6 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Processing event network-vif-plugged-7871a5b7-b713-4fab-810f-37a03f953665 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:32.138 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[0cab81f0-d283-4ae4-a9d7-bb29e4b7839b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa2dd081e-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6b:b0:d1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 309062, 'reachable_time': 42628, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 207885, 'error': None, 'target': 'ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:32.159 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[637e691c-e555-46d5-b1a1-d019c6209a66]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:32.207 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[0ade94c7-1823-4fca-899e-90c89558d6c5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:32.208 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa2dd081e-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:32.208 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:32.208 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa2dd081e-d0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:13:32 compute-0 nova_compute[186329]: 2025-12-05 06:13:32.210 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:32 compute-0 NetworkManager[55434]: <info>  [1764915212.2113] manager: (tapa2dd081e-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Dec 05 06:13:32 compute-0 kernel: tapa2dd081e-d0: entered promiscuous mode
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:32.214 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa2dd081e-d0, col_values=(('external_ids', {'iface-id': '476e02c7-6c32-4db2-b808-685677ca76e8'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:13:32 compute-0 nova_compute[186329]: 2025-12-05 06:13:32.215 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:32 compute-0 ovn_controller[95223]: 2025-12-05T06:13:32Z|00053|binding|INFO|Releasing lport 476e02c7-6c32-4db2-b808-685677ca76e8 from this chassis (sb_readonly=0)
Dec 05 06:13:32 compute-0 nova_compute[186329]: 2025-12-05 06:13:32.226 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:32.228 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[84d42e2b-8972-43d9-aef3-53d20a60540e]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:32.228 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a2dd081e-dc20-4351-ac59-ccdb3568905c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a2dd081e-dc20-4351-ac59-ccdb3568905c.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:32.228 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a2dd081e-dc20-4351-ac59-ccdb3568905c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a2dd081e-dc20-4351-ac59-ccdb3568905c.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:32.228 104041 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for a2dd081e-dc20-4351-ac59-ccdb3568905c disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:32.229 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a2dd081e-dc20-4351-ac59-ccdb3568905c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a2dd081e-dc20-4351-ac59-ccdb3568905c.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:32.229 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[6371fa8c-97fc-486c-a048-6744e3203918]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:32.229 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a2dd081e-dc20-4351-ac59-ccdb3568905c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a2dd081e-dc20-4351-ac59-ccdb3568905c.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:32.229 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[e59b2ed5-e90e-47df-9cdd-e0a8323b8e2c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:32.230 104041 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]: global
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]:     log         /dev/log local0 debug
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]:     log-tag     haproxy-metadata-proxy-a2dd081e-dc20-4351-ac59-ccdb3568905c
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]:     user        root
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]:     group       root
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]:     maxconn     1024
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]:     pidfile     /var/lib/neutron/external/pids/a2dd081e-dc20-4351-ac59-ccdb3568905c.pid.haproxy
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]:     daemon
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]: defaults
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]:     log global
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]:     mode http
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]:     option httplog
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]:     option dontlognull
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]:     option http-server-close
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]:     option forwardfor
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]:     retries                 3
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]:     timeout http-request    30s
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]:     timeout connect         30s
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]:     timeout client          32s
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]:     timeout server          32s
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]:     timeout http-keep-alive 30s
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]: listen listener
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]:     bind 169.254.169.254:80
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]:     
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]:     http-request add-header X-OVN-Network-ID a2dd081e-dc20-4351-ac59-ccdb3568905c
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 05 06:13:32 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:32.230 104041 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c', 'env', 'PROCESS_TAG=haproxy-a2dd081e-dc20-4351-ac59-ccdb3568905c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a2dd081e-dc20-4351-ac59-ccdb3568905c.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 05 06:13:32 compute-0 podman[207915]: 2025-12-05 06:13:32.546583958 +0000 UTC m=+0.036816137 container create 5e16bc081e761a41b69e8b3d184f5e52f02d4cfb8610d076a7e09e6873c35a73 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 05 06:13:32 compute-0 systemd[1]: Started libpod-conmon-5e16bc081e761a41b69e8b3d184f5e52f02d4cfb8610d076a7e09e6873c35a73.scope.
Dec 05 06:13:32 compute-0 systemd[1]: Started libcrun container.
Dec 05 06:13:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/00f7222b0cdd07ab01701473efe1fe9aad6db85ab935835f154dab7cac8d1090/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 06:13:32 compute-0 nova_compute[186329]: 2025-12-05 06:13:32.609 186333 DEBUG nova.compute.manager [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 05 06:13:32 compute-0 podman[207915]: 2025-12-05 06:13:32.612061296 +0000 UTC m=+0.102293473 container init 5e16bc081e761a41b69e8b3d184f5e52f02d4cfb8610d076a7e09e6873c35a73 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec)
Dec 05 06:13:32 compute-0 nova_compute[186329]: 2025-12-05 06:13:32.614 186333 DEBUG nova.virt.libvirt.driver [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Dec 05 06:13:32 compute-0 nova_compute[186329]: 2025-12-05 06:13:32.618 186333 INFO nova.virt.libvirt.driver [-] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Instance spawned successfully.
Dec 05 06:13:32 compute-0 podman[207915]: 2025-12-05 06:13:32.619570755 +0000 UTC m=+0.109802933 container start 5e16bc081e761a41b69e8b3d184f5e52f02d4cfb8610d076a7e09e6873c35a73 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec)
Dec 05 06:13:32 compute-0 nova_compute[186329]: 2025-12-05 06:13:32.618 186333 DEBUG nova.virt.libvirt.driver [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Dec 05 06:13:32 compute-0 podman[207915]: 2025-12-05 06:13:32.531745097 +0000 UTC m=+0.021977295 image pull 773fbabd60bc1c8e2ad203301a187a593937fa42bcc751aba0971bd96baac0cb quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current
Dec 05 06:13:32 compute-0 neutron-haproxy-ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c[207933]: [NOTICE]   (207938) : New worker (207940) forked
Dec 05 06:13:32 compute-0 neutron-haproxy-ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c[207933]: [NOTICE]   (207938) : Loading success.
Dec 05 06:13:33 compute-0 nova_compute[186329]: 2025-12-05 06:13:33.141 186333 DEBUG nova.virt.libvirt.driver [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:13:33 compute-0 nova_compute[186329]: 2025-12-05 06:13:33.144 186333 DEBUG nova.virt.libvirt.driver [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:13:33 compute-0 nova_compute[186329]: 2025-12-05 06:13:33.145 186333 DEBUG nova.virt.libvirt.driver [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:13:33 compute-0 nova_compute[186329]: 2025-12-05 06:13:33.145 186333 DEBUG nova.virt.libvirt.driver [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:13:33 compute-0 nova_compute[186329]: 2025-12-05 06:13:33.145 186333 DEBUG nova.virt.libvirt.driver [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:13:33 compute-0 nova_compute[186329]: 2025-12-05 06:13:33.146 186333 DEBUG nova.virt.libvirt.driver [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:13:33 compute-0 nova_compute[186329]: 2025-12-05 06:13:33.599 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:33 compute-0 nova_compute[186329]: 2025-12-05 06:13:33.654 186333 INFO nova.compute.manager [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Took 11.59 seconds to spawn the instance on the hypervisor.
Dec 05 06:13:33 compute-0 nova_compute[186329]: 2025-12-05 06:13:33.654 186333 DEBUG nova.compute.manager [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 05 06:13:34 compute-0 nova_compute[186329]: 2025-12-05 06:13:34.180 186333 DEBUG nova.compute.manager [req-293951fc-5b4f-4dd1-9846-127a69863934 req-9bf4c20a-5bad-4a81-b3c2-4610d56081cd fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Received event network-vif-plugged-7871a5b7-b713-4fab-810f-37a03f953665 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:13:34 compute-0 nova_compute[186329]: 2025-12-05 06:13:34.182 186333 DEBUG oslo_concurrency.lockutils [req-293951fc-5b4f-4dd1-9846-127a69863934 req-9bf4c20a-5bad-4a81-b3c2-4610d56081cd fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "496a64d6-66b3-43ee-9e98-466ec3fd223d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:13:34 compute-0 nova_compute[186329]: 2025-12-05 06:13:34.183 186333 DEBUG oslo_concurrency.lockutils [req-293951fc-5b4f-4dd1-9846-127a69863934 req-9bf4c20a-5bad-4a81-b3c2-4610d56081cd fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "496a64d6-66b3-43ee-9e98-466ec3fd223d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:13:34 compute-0 nova_compute[186329]: 2025-12-05 06:13:34.183 186333 DEBUG oslo_concurrency.lockutils [req-293951fc-5b4f-4dd1-9846-127a69863934 req-9bf4c20a-5bad-4a81-b3c2-4610d56081cd fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "496a64d6-66b3-43ee-9e98-466ec3fd223d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:13:34 compute-0 nova_compute[186329]: 2025-12-05 06:13:34.184 186333 DEBUG nova.compute.manager [req-293951fc-5b4f-4dd1-9846-127a69863934 req-9bf4c20a-5bad-4a81-b3c2-4610d56081cd fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] No waiting events found dispatching network-vif-plugged-7871a5b7-b713-4fab-810f-37a03f953665 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:13:34 compute-0 nova_compute[186329]: 2025-12-05 06:13:34.184 186333 WARNING nova.compute.manager [req-293951fc-5b4f-4dd1-9846-127a69863934 req-9bf4c20a-5bad-4a81-b3c2-4610d56081cd fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Received unexpected event network-vif-plugged-7871a5b7-b713-4fab-810f-37a03f953665 for instance with vm_state active and task_state None.
Dec 05 06:13:34 compute-0 nova_compute[186329]: 2025-12-05 06:13:34.194 186333 INFO nova.compute.manager [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Took 16.85 seconds to build instance.
Dec 05 06:13:34 compute-0 nova_compute[186329]: 2025-12-05 06:13:34.568 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:34 compute-0 nova_compute[186329]: 2025-12-05 06:13:34.699 186333 DEBUG oslo_concurrency.lockutils [None req-aeb047ac-977c-400f-9da6-34d9199ef2e8 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Lock "496a64d6-66b3-43ee-9e98-466ec3fd223d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.360s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:13:38 compute-0 nova_compute[186329]: 2025-12-05 06:13:38.594 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:39 compute-0 podman[207946]: 2025-12-05 06:13:39.462378915 +0000 UTC m=+0.045532576 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 06:13:39 compute-0 podman[207945]: 2025-12-05 06:13:39.48172458 +0000 UTC m=+0.066473040 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true)
Dec 05 06:13:39 compute-0 nova_compute[186329]: 2025-12-05 06:13:39.571 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:40 compute-0 nova_compute[186329]: 2025-12-05 06:13:40.964 186333 DEBUG nova.compute.manager [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Stashing vm_state: active _prep_resize /usr/lib/python3.12/site-packages/nova/compute/manager.py:6173
Dec 05 06:13:41 compute-0 nova_compute[186329]: 2025-12-05 06:13:41.492 186333 DEBUG oslo_concurrency.lockutils [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:13:41 compute-0 nova_compute[186329]: 2025-12-05 06:13:41.492 186333 DEBUG oslo_concurrency.lockutils [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:13:41 compute-0 nova_compute[186329]: 2025-12-05 06:13:41.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:13:42 compute-0 nova_compute[186329]: 2025-12-05 06:13:42.003 186333 DEBUG nova.virt.hardware [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 05 06:13:42 compute-0 nova_compute[186329]: 2025-12-05 06:13:42.004 186333 INFO nova.compute.claims [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Claim successful on node compute-0.ctlplane.example.com
Dec 05 06:13:42 compute-0 nova_compute[186329]: 2025-12-05 06:13:42.217 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:13:42 compute-0 nova_compute[186329]: 2025-12-05 06:13:42.510 186333 INFO nova.compute.resource_tracker [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Updating resource usage from migration 14b1a4b9-e3ac-4d2c-9342-40e3b5aeccb3
Dec 05 06:13:42 compute-0 nova_compute[186329]: 2025-12-05 06:13:42.511 186333 DEBUG nova.compute.resource_tracker [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Starting to track incoming migration 14b1a4b9-e3ac-4d2c-9342-40e3b5aeccb3 with flavor da5dfa03-1fbb-4783-97bb-d13dca9dd7f2 _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Dec 05 06:13:43 compute-0 nova_compute[186329]: 2025-12-05 06:13:43.053 186333 DEBUG nova.compute.provider_tree [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:13:43 compute-0 nova_compute[186329]: 2025-12-05 06:13:43.558 186333 DEBUG nova.scheduler.client.report [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:13:43 compute-0 nova_compute[186329]: 2025-12-05 06:13:43.595 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:44 compute-0 ovn_controller[95223]: 2025-12-05T06:13:44Z|00003|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0a:bd:1d 10.100.0.5
Dec 05 06:13:44 compute-0 ovn_controller[95223]: 2025-12-05T06:13:44Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0a:bd:1d 10.100.0.5
Dec 05 06:13:44 compute-0 nova_compute[186329]: 2025-12-05 06:13:44.065 186333 DEBUG oslo_concurrency.lockutils [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 2.573s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:13:44 compute-0 nova_compute[186329]: 2025-12-05 06:13:44.066 186333 INFO nova.compute.manager [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Migrating
Dec 05 06:13:44 compute-0 nova_compute[186329]: 2025-12-05 06:13:44.066 186333 DEBUG oslo_concurrency.lockutils [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:13:44 compute-0 nova_compute[186329]: 2025-12-05 06:13:44.066 186333 DEBUG oslo_concurrency.lockutils [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:13:44 compute-0 nova_compute[186329]: 2025-12-05 06:13:44.067 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 1.850s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:13:44 compute-0 nova_compute[186329]: 2025-12-05 06:13:44.068 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:13:44 compute-0 nova_compute[186329]: 2025-12-05 06:13:44.068 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 05 06:13:44 compute-0 nova_compute[186329]: 2025-12-05 06:13:44.571 186333 INFO nova.compute.rpcapi [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Automatically selected compute RPC version 6.4 from minimum service version 70
Dec 05 06:13:44 compute-0 nova_compute[186329]: 2025-12-05 06:13:44.572 186333 DEBUG oslo_concurrency.lockutils [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:13:44 compute-0 nova_compute[186329]: 2025-12-05 06:13:44.577 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:45 compute-0 nova_compute[186329]: 2025-12-05 06:13:45.093 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/496a64d6-66b3-43ee-9e98-466ec3fd223d/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:13:45 compute-0 nova_compute[186329]: 2025-12-05 06:13:45.136 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/496a64d6-66b3-43ee-9e98-466ec3fd223d/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:13:45 compute-0 nova_compute[186329]: 2025-12-05 06:13:45.137 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/496a64d6-66b3-43ee-9e98-466ec3fd223d/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:13:45 compute-0 nova_compute[186329]: 2025-12-05 06:13:45.189 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/496a64d6-66b3-43ee-9e98-466ec3fd223d/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:13:45 compute-0 nova_compute[186329]: 2025-12-05 06:13:45.370 186333 WARNING nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:13:45 compute-0 nova_compute[186329]: 2025-12-05 06:13:45.372 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:13:45 compute-0 nova_compute[186329]: 2025-12-05 06:13:45.386 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.014s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:13:45 compute-0 nova_compute[186329]: 2025-12-05 06:13:45.387 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5738MB free_disk=73.14130401611328GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 05 06:13:45 compute-0 nova_compute[186329]: 2025-12-05 06:13:45.387 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:13:45 compute-0 nova_compute[186329]: 2025-12-05 06:13:45.388 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:13:46 compute-0 nova_compute[186329]: 2025-12-05 06:13:46.401 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Migration for instance 2f8d28d2-73a5-43e3-84d2-0e117d02d93a refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Dec 05 06:13:46 compute-0 nova_compute[186329]: 2025-12-05 06:13:46.905 186333 INFO nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Updating resource usage from migration 14b1a4b9-e3ac-4d2c-9342-40e3b5aeccb3
Dec 05 06:13:46 compute-0 nova_compute[186329]: 2025-12-05 06:13:46.906 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Starting to track incoming migration 14b1a4b9-e3ac-4d2c-9342-40e3b5aeccb3 with flavor da5dfa03-1fbb-4783-97bb-d13dca9dd7f2 _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Dec 05 06:13:46 compute-0 nova_compute[186329]: 2025-12-05 06:13:46.921 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance 496a64d6-66b3-43ee-9e98-466ec3fd223d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 05 06:13:47 compute-0 nova_compute[186329]: 2025-12-05 06:13:47.425 186333 WARNING nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance 2f8d28d2-73a5-43e3-84d2-0e117d02d93a has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}.
Dec 05 06:13:47 compute-0 nova_compute[186329]: 2025-12-05 06:13:47.426 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 05 06:13:47 compute-0 nova_compute[186329]: 2025-12-05 06:13:47.426 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=832MB phys_disk=79GB used_disk=2GB total_vcpus=4 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 06:13:45 up 51 min,  0 user,  load average: 0.47, 0.31, 0.37\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_5e3ec7864ec74a8e9a98ea7d30769fb0': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 05 06:13:47 compute-0 nova_compute[186329]: 2025-12-05 06:13:47.462 186333 DEBUG nova.compute.provider_tree [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:13:47 compute-0 podman[208010]: 2025-12-05 06:13:47.464490926 +0000 UTC m=+0.044485300 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Dec 05 06:13:47 compute-0 podman[208011]: 2025-12-05 06:13:47.482575726 +0000 UTC m=+0.060698701 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, version=9.6, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., architecture=x86_64, vcs-type=git, managed_by=edpm_ansible, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container)
Dec 05 06:13:47 compute-0 podman[208012]: 2025-12-05 06:13:47.503444543 +0000 UTC m=+0.079095859 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec)
Dec 05 06:13:47 compute-0 nova_compute[186329]: 2025-12-05 06:13:47.967 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:13:48 compute-0 nova_compute[186329]: 2025-12-05 06:13:48.473 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 05 06:13:48 compute-0 nova_compute[186329]: 2025-12-05 06:13:48.474 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.086s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:13:48 compute-0 nova_compute[186329]: 2025-12-05 06:13:48.596 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:49 compute-0 nova_compute[186329]: 2025-12-05 06:13:49.469 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:13:49 compute-0 nova_compute[186329]: 2025-12-05 06:13:49.470 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:13:49 compute-0 nova_compute[186329]: 2025-12-05 06:13:49.470 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:13:49 compute-0 nova_compute[186329]: 2025-12-05 06:13:49.470 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:13:49 compute-0 nova_compute[186329]: 2025-12-05 06:13:49.470 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:13:49 compute-0 nova_compute[186329]: 2025-12-05 06:13:49.470 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:13:49 compute-0 nova_compute[186329]: 2025-12-05 06:13:49.471 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:13:49 compute-0 nova_compute[186329]: 2025-12-05 06:13:49.471 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 05 06:13:49 compute-0 nova_compute[186329]: 2025-12-05 06:13:49.578 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:49 compute-0 sshd-session[208061]: Accepted publickey for nova from 192.168.122.101 port 49494 ssh2: ECDSA SHA256:8rTbzXOcH8CdlF3YzY7JfZvES5fJu9+6SCscv8SkECA
Dec 05 06:13:49 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Dec 05 06:13:49 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Dec 05 06:13:49 compute-0 systemd-logind[745]: New session 26 of user nova.
Dec 05 06:13:49 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Dec 05 06:13:49 compute-0 systemd[1]: Starting User Manager for UID 42436...
Dec 05 06:13:49 compute-0 systemd[208065]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Dec 05 06:13:49 compute-0 systemd[208065]: Queued start job for default target Main User Target.
Dec 05 06:13:49 compute-0 systemd[208065]: Created slice User Application Slice.
Dec 05 06:13:49 compute-0 systemd[208065]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 05 06:13:49 compute-0 systemd[208065]: Started Daily Cleanup of User's Temporary Directories.
Dec 05 06:13:49 compute-0 systemd[208065]: Reached target Paths.
Dec 05 06:13:49 compute-0 systemd[208065]: Reached target Timers.
Dec 05 06:13:49 compute-0 systemd[208065]: Starting D-Bus User Message Bus Socket...
Dec 05 06:13:49 compute-0 systemd[208065]: Starting Create User's Volatile Files and Directories...
Dec 05 06:13:49 compute-0 systemd[208065]: Listening on D-Bus User Message Bus Socket.
Dec 05 06:13:49 compute-0 systemd[208065]: Reached target Sockets.
Dec 05 06:13:49 compute-0 systemd[208065]: Finished Create User's Volatile Files and Directories.
Dec 05 06:13:49 compute-0 systemd[208065]: Reached target Basic System.
Dec 05 06:13:49 compute-0 systemd[208065]: Reached target Main User Target.
Dec 05 06:13:49 compute-0 systemd[208065]: Startup finished in 88ms.
Dec 05 06:13:49 compute-0 systemd[1]: Started User Manager for UID 42436.
Dec 05 06:13:49 compute-0 systemd[1]: Started Session 26 of User nova.
Dec 05 06:13:49 compute-0 sshd-session[208061]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Dec 05 06:13:49 compute-0 sshd-session[208080]: Received disconnect from 192.168.122.101 port 49494:11: disconnected by user
Dec 05 06:13:49 compute-0 sshd-session[208080]: Disconnected from user nova 192.168.122.101 port 49494
Dec 05 06:13:49 compute-0 sshd-session[208061]: pam_unix(sshd:session): session closed for user nova
Dec 05 06:13:49 compute-0 systemd[1]: session-26.scope: Deactivated successfully.
Dec 05 06:13:49 compute-0 systemd-logind[745]: Session 26 logged out. Waiting for processes to exit.
Dec 05 06:13:49 compute-0 systemd-logind[745]: Removed session 26.
Dec 05 06:13:49 compute-0 sshd-session[208082]: Accepted publickey for nova from 192.168.122.101 port 49510 ssh2: ECDSA SHA256:8rTbzXOcH8CdlF3YzY7JfZvES5fJu9+6SCscv8SkECA
Dec 05 06:13:49 compute-0 systemd-logind[745]: New session 28 of user nova.
Dec 05 06:13:49 compute-0 systemd[1]: Started Session 28 of User nova.
Dec 05 06:13:49 compute-0 sshd-session[208082]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Dec 05 06:13:50 compute-0 sshd-session[208085]: Received disconnect from 192.168.122.101 port 49510:11: disconnected by user
Dec 05 06:13:50 compute-0 sshd-session[208085]: Disconnected from user nova 192.168.122.101 port 49510
Dec 05 06:13:50 compute-0 sshd-session[208082]: pam_unix(sshd:session): session closed for user nova
Dec 05 06:13:50 compute-0 systemd[1]: session-28.scope: Deactivated successfully.
Dec 05 06:13:50 compute-0 systemd-logind[745]: Session 28 logged out. Waiting for processes to exit.
Dec 05 06:13:50 compute-0 systemd-logind[745]: Removed session 28.
Dec 05 06:13:52 compute-0 nova_compute[186329]: 2025-12-05 06:13:52.553 186333 DEBUG nova.compute.manager [req-69c6b384-202d-4f2c-83a5-35882ed226b9 req-1e233355-c2fc-494c-b415-5aa6636bb650 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Received event network-vif-unplugged-f2156bd9-041c-4b4a-8a98-32fa8b5b95a0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:13:52 compute-0 nova_compute[186329]: 2025-12-05 06:13:52.554 186333 DEBUG oslo_concurrency.lockutils [req-69c6b384-202d-4f2c-83a5-35882ed226b9 req-1e233355-c2fc-494c-b415-5aa6636bb650 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "2f8d28d2-73a5-43e3-84d2-0e117d02d93a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:13:52 compute-0 nova_compute[186329]: 2025-12-05 06:13:52.554 186333 DEBUG oslo_concurrency.lockutils [req-69c6b384-202d-4f2c-83a5-35882ed226b9 req-1e233355-c2fc-494c-b415-5aa6636bb650 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "2f8d28d2-73a5-43e3-84d2-0e117d02d93a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:13:52 compute-0 nova_compute[186329]: 2025-12-05 06:13:52.554 186333 DEBUG oslo_concurrency.lockutils [req-69c6b384-202d-4f2c-83a5-35882ed226b9 req-1e233355-c2fc-494c-b415-5aa6636bb650 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "2f8d28d2-73a5-43e3-84d2-0e117d02d93a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:13:52 compute-0 nova_compute[186329]: 2025-12-05 06:13:52.554 186333 DEBUG nova.compute.manager [req-69c6b384-202d-4f2c-83a5-35882ed226b9 req-1e233355-c2fc-494c-b415-5aa6636bb650 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] No waiting events found dispatching network-vif-unplugged-f2156bd9-041c-4b4a-8a98-32fa8b5b95a0 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:13:52 compute-0 nova_compute[186329]: 2025-12-05 06:13:52.554 186333 WARNING nova.compute.manager [req-69c6b384-202d-4f2c-83a5-35882ed226b9 req-1e233355-c2fc-494c-b415-5aa6636bb650 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Received unexpected event network-vif-unplugged-f2156bd9-041c-4b4a-8a98-32fa8b5b95a0 for instance with vm_state active and task_state resize_migrating.
Dec 05 06:13:52 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:52.575 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '46:84:d1', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'a6:99:88:db:d9:a2'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:13:52 compute-0 nova_compute[186329]: 2025-12-05 06:13:52.576 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:52 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:52.576 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 05 06:13:53 compute-0 sshd-session[208088]: Accepted publickey for nova from 192.168.122.101 port 49512 ssh2: ECDSA SHA256:8rTbzXOcH8CdlF3YzY7JfZvES5fJu9+6SCscv8SkECA
Dec 05 06:13:53 compute-0 systemd-logind[745]: New session 29 of user nova.
Dec 05 06:13:53 compute-0 systemd[1]: Started Session 29 of User nova.
Dec 05 06:13:53 compute-0 sshd-session[208088]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Dec 05 06:13:53 compute-0 sshd-session[208091]: Received disconnect from 192.168.122.101 port 49512:11: disconnected by user
Dec 05 06:13:53 compute-0 sshd-session[208091]: Disconnected from user nova 192.168.122.101 port 49512
Dec 05 06:13:53 compute-0 sshd-session[208088]: pam_unix(sshd:session): session closed for user nova
Dec 05 06:13:53 compute-0 systemd[1]: session-29.scope: Deactivated successfully.
Dec 05 06:13:53 compute-0 systemd-logind[745]: Session 29 logged out. Waiting for processes to exit.
Dec 05 06:13:53 compute-0 systemd-logind[745]: Removed session 29.
Dec 05 06:13:53 compute-0 nova_compute[186329]: 2025-12-05 06:13:53.600 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:53 compute-0 sshd-session[208093]: Accepted publickey for nova from 192.168.122.101 port 49522 ssh2: ECDSA SHA256:8rTbzXOcH8CdlF3YzY7JfZvES5fJu9+6SCscv8SkECA
Dec 05 06:13:53 compute-0 systemd-logind[745]: New session 30 of user nova.
Dec 05 06:13:53 compute-0 systemd[1]: Started Session 30 of User nova.
Dec 05 06:13:53 compute-0 sshd-session[208093]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Dec 05 06:13:53 compute-0 sshd-session[208096]: Received disconnect from 192.168.122.101 port 49522:11: disconnected by user
Dec 05 06:13:53 compute-0 sshd-session[208096]: Disconnected from user nova 192.168.122.101 port 49522
Dec 05 06:13:53 compute-0 sshd-session[208093]: pam_unix(sshd:session): session closed for user nova
Dec 05 06:13:53 compute-0 systemd[1]: session-30.scope: Deactivated successfully.
Dec 05 06:13:53 compute-0 systemd-logind[745]: Session 30 logged out. Waiting for processes to exit.
Dec 05 06:13:53 compute-0 systemd-logind[745]: Removed session 30.
Dec 05 06:13:53 compute-0 sshd-session[208098]: Accepted publickey for nova from 192.168.122.101 port 49536 ssh2: ECDSA SHA256:8rTbzXOcH8CdlF3YzY7JfZvES5fJu9+6SCscv8SkECA
Dec 05 06:13:53 compute-0 systemd-logind[745]: New session 31 of user nova.
Dec 05 06:13:53 compute-0 systemd[1]: Started Session 31 of User nova.
Dec 05 06:13:53 compute-0 sshd-session[208098]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Dec 05 06:13:53 compute-0 sshd-session[208101]: Received disconnect from 192.168.122.101 port 49536:11: disconnected by user
Dec 05 06:13:53 compute-0 sshd-session[208101]: Disconnected from user nova 192.168.122.101 port 49536
Dec 05 06:13:53 compute-0 sshd-session[208098]: pam_unix(sshd:session): session closed for user nova
Dec 05 06:13:53 compute-0 systemd[1]: session-31.scope: Deactivated successfully.
Dec 05 06:13:53 compute-0 systemd-logind[745]: Session 31 logged out. Waiting for processes to exit.
Dec 05 06:13:53 compute-0 systemd-logind[745]: Removed session 31.
Dec 05 06:13:54 compute-0 nova_compute[186329]: 2025-12-05 06:13:54.579 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:54 compute-0 nova_compute[186329]: 2025-12-05 06:13:54.629 186333 DEBUG nova.compute.manager [req-8111ea38-fb70-4cc2-a263-00659926256f req-ee02adcc-101e-4932-b903-291f9caa257b fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Received event network-vif-unplugged-f2156bd9-041c-4b4a-8a98-32fa8b5b95a0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:13:54 compute-0 nova_compute[186329]: 2025-12-05 06:13:54.629 186333 DEBUG oslo_concurrency.lockutils [req-8111ea38-fb70-4cc2-a263-00659926256f req-ee02adcc-101e-4932-b903-291f9caa257b fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "2f8d28d2-73a5-43e3-84d2-0e117d02d93a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:13:54 compute-0 nova_compute[186329]: 2025-12-05 06:13:54.630 186333 DEBUG oslo_concurrency.lockutils [req-8111ea38-fb70-4cc2-a263-00659926256f req-ee02adcc-101e-4932-b903-291f9caa257b fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "2f8d28d2-73a5-43e3-84d2-0e117d02d93a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:13:54 compute-0 nova_compute[186329]: 2025-12-05 06:13:54.630 186333 DEBUG oslo_concurrency.lockutils [req-8111ea38-fb70-4cc2-a263-00659926256f req-ee02adcc-101e-4932-b903-291f9caa257b fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "2f8d28d2-73a5-43e3-84d2-0e117d02d93a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:13:54 compute-0 nova_compute[186329]: 2025-12-05 06:13:54.630 186333 DEBUG nova.compute.manager [req-8111ea38-fb70-4cc2-a263-00659926256f req-ee02adcc-101e-4932-b903-291f9caa257b fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] No waiting events found dispatching network-vif-unplugged-f2156bd9-041c-4b4a-8a98-32fa8b5b95a0 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:13:54 compute-0 nova_compute[186329]: 2025-12-05 06:13:54.630 186333 WARNING nova.compute.manager [req-8111ea38-fb70-4cc2-a263-00659926256f req-ee02adcc-101e-4932-b903-291f9caa257b fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Received unexpected event network-vif-unplugged-f2156bd9-041c-4b4a-8a98-32fa8b5b95a0 for instance with vm_state active and task_state resize_migrating.
Dec 05 06:13:57 compute-0 nova_compute[186329]: 2025-12-05 06:13:57.088 186333 WARNING neutronclient.v2_0.client [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:13:57 compute-0 nova_compute[186329]: 2025-12-05 06:13:57.276 186333 INFO nova.network.neutron [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Updating port f2156bd9-041c-4b4a-8a98-32fa8b5b95a0 with attributes {'binding:host_id': 'compute-0.ctlplane.example.com', 'device_owner': 'compute:nova'}
Dec 05 06:13:57 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:57.578 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=89d40815-76f5-4f1d-9077-84d831b7d6c4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:13:57 compute-0 nova_compute[186329]: 2025-12-05 06:13:57.664 186333 DEBUG oslo_concurrency.lockutils [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "refresh_cache-2f8d28d2-73a5-43e3-84d2-0e117d02d93a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:13:57 compute-0 nova_compute[186329]: 2025-12-05 06:13:57.665 186333 DEBUG oslo_concurrency.lockutils [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquired lock "refresh_cache-2f8d28d2-73a5-43e3-84d2-0e117d02d93a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:13:57 compute-0 nova_compute[186329]: 2025-12-05 06:13:57.665 186333 DEBUG nova.network.neutron [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 05 06:13:57 compute-0 nova_compute[186329]: 2025-12-05 06:13:57.705 186333 DEBUG nova.compute.manager [req-ce5d3234-f7d2-40b7-bafa-6ebeb1d953b5 req-e3c06e5b-7743-460d-b740-8bb0be11f119 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Received event network-changed-f2156bd9-041c-4b4a-8a98-32fa8b5b95a0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:13:57 compute-0 nova_compute[186329]: 2025-12-05 06:13:57.705 186333 DEBUG nova.compute.manager [req-ce5d3234-f7d2-40b7-bafa-6ebeb1d953b5 req-e3c06e5b-7743-460d-b740-8bb0be11f119 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Refreshing instance network info cache due to event network-changed-f2156bd9-041c-4b4a-8a98-32fa8b5b95a0. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 05 06:13:57 compute-0 nova_compute[186329]: 2025-12-05 06:13:57.706 186333 DEBUG oslo_concurrency.lockutils [req-ce5d3234-f7d2-40b7-bafa-6ebeb1d953b5 req-e3c06e5b-7743-460d-b740-8bb0be11f119 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "refresh_cache-2f8d28d2-73a5-43e3-84d2-0e117d02d93a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:13:57 compute-0 nova_compute[186329]: 2025-12-05 06:13:57.752 186333 DEBUG oslo_concurrency.lockutils [None req-357d9bbc-b62e-4ceb-93c2-df3ae01ade29 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Acquiring lock "496a64d6-66b3-43ee-9e98-466ec3fd223d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:13:57 compute-0 nova_compute[186329]: 2025-12-05 06:13:57.752 186333 DEBUG oslo_concurrency.lockutils [None req-357d9bbc-b62e-4ceb-93c2-df3ae01ade29 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Lock "496a64d6-66b3-43ee-9e98-466ec3fd223d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:13:57 compute-0 nova_compute[186329]: 2025-12-05 06:13:57.752 186333 DEBUG oslo_concurrency.lockutils [None req-357d9bbc-b62e-4ceb-93c2-df3ae01ade29 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Acquiring lock "496a64d6-66b3-43ee-9e98-466ec3fd223d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:13:57 compute-0 nova_compute[186329]: 2025-12-05 06:13:57.752 186333 DEBUG oslo_concurrency.lockutils [None req-357d9bbc-b62e-4ceb-93c2-df3ae01ade29 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Lock "496a64d6-66b3-43ee-9e98-466ec3fd223d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:13:57 compute-0 nova_compute[186329]: 2025-12-05 06:13:57.753 186333 DEBUG oslo_concurrency.lockutils [None req-357d9bbc-b62e-4ceb-93c2-df3ae01ade29 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Lock "496a64d6-66b3-43ee-9e98-466ec3fd223d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:13:57 compute-0 nova_compute[186329]: 2025-12-05 06:13:57.759 186333 INFO nova.compute.manager [None req-357d9bbc-b62e-4ceb-93c2-df3ae01ade29 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Terminating instance
Dec 05 06:13:58 compute-0 nova_compute[186329]: 2025-12-05 06:13:58.168 186333 WARNING neutronclient.v2_0.client [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:13:58 compute-0 nova_compute[186329]: 2025-12-05 06:13:58.268 186333 DEBUG nova.compute.manager [None req-357d9bbc-b62e-4ceb-93c2-df3ae01ade29 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 05 06:13:58 compute-0 kernel: tap7871a5b7-b7 (unregistering): left promiscuous mode
Dec 05 06:13:58 compute-0 NetworkManager[55434]: <info>  [1764915238.2887] device (tap7871a5b7-b7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 06:13:58 compute-0 ovn_controller[95223]: 2025-12-05T06:13:58Z|00054|binding|INFO|Releasing lport 7871a5b7-b713-4fab-810f-37a03f953665 from this chassis (sb_readonly=0)
Dec 05 06:13:58 compute-0 ovn_controller[95223]: 2025-12-05T06:13:58Z|00055|binding|INFO|Setting lport 7871a5b7-b713-4fab-810f-37a03f953665 down in Southbound
Dec 05 06:13:58 compute-0 ovn_controller[95223]: 2025-12-05T06:13:58Z|00056|binding|INFO|Removing iface tap7871a5b7-b7 ovn-installed in OVS
Dec 05 06:13:58 compute-0 nova_compute[186329]: 2025-12-05 06:13:58.295 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:58 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:58.299 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0a:bd:1d 10.100.0.5'], port_security=['fa:16:3e:0a:bd:1d 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '496a64d6-66b3-43ee-9e98-466ec3fd223d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a2dd081e-dc20-4351-ac59-ccdb3568905c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e3ec7864ec74a8e9a98ea7d30769fb0', 'neutron:revision_number': '5', 'neutron:security_group_ids': '4eeb11fa-86ff-4478-bf3d-491b18f116e0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=78329d2c-9d75-49ec-bc81-89ddda82582c, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], logical_port=7871a5b7-b713-4fab-810f-37a03f953665) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:13:58 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:58.300 104041 INFO neutron.agent.ovn.metadata.agent [-] Port 7871a5b7-b713-4fab-810f-37a03f953665 in datapath a2dd081e-dc20-4351-ac59-ccdb3568905c unbound from our chassis
Dec 05 06:13:58 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:58.301 104041 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a2dd081e-dc20-4351-ac59-ccdb3568905c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 05 06:13:58 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:58.303 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[88b792f0-978b-4bd2-8e8a-3e67b7d00ebf]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:13:58 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:58.304 104041 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c namespace which is not needed anymore
Dec 05 06:13:58 compute-0 nova_compute[186329]: 2025-12-05 06:13:58.302 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:58 compute-0 nova_compute[186329]: 2025-12-05 06:13:58.312 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:58 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000005.scope: Deactivated successfully.
Dec 05 06:13:58 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000005.scope: Consumed 11.644s CPU time.
Dec 05 06:13:58 compute-0 systemd-machined[152967]: Machine qemu-2-instance-00000005 terminated.
Dec 05 06:13:58 compute-0 neutron-haproxy-ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c[207933]: [NOTICE]   (207938) : haproxy version is 3.0.5-8e879a5
Dec 05 06:13:58 compute-0 neutron-haproxy-ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c[207933]: [NOTICE]   (207938) : path to executable is /usr/sbin/haproxy
Dec 05 06:13:58 compute-0 neutron-haproxy-ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c[207933]: [WARNING]  (207938) : Exiting Master process...
Dec 05 06:13:58 compute-0 podman[208125]: 2025-12-05 06:13:58.407115657 +0000 UTC m=+0.026260035 container kill 5e16bc081e761a41b69e8b3d184f5e52f02d4cfb8610d076a7e09e6873c35a73 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 05 06:13:58 compute-0 neutron-haproxy-ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c[207933]: [ALERT]    (207938) : Current worker (207940) exited with code 143 (Terminated)
Dec 05 06:13:58 compute-0 neutron-haproxy-ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c[207933]: [WARNING]  (207938) : All workers exited. Exiting... (0)
Dec 05 06:13:58 compute-0 systemd[1]: libpod-5e16bc081e761a41b69e8b3d184f5e52f02d4cfb8610d076a7e09e6873c35a73.scope: Deactivated successfully.
Dec 05 06:13:58 compute-0 podman[208136]: 2025-12-05 06:13:58.452654858 +0000 UTC m=+0.024701996 container died 5e16bc081e761a41b69e8b3d184f5e52f02d4cfb8610d076a7e09e6873c35a73 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 05 06:13:58 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5e16bc081e761a41b69e8b3d184f5e52f02d4cfb8610d076a7e09e6873c35a73-userdata-shm.mount: Deactivated successfully.
Dec 05 06:13:58 compute-0 systemd[1]: var-lib-containers-storage-overlay-00f7222b0cdd07ab01701473efe1fe9aad6db85ab935835f154dab7cac8d1090-merged.mount: Deactivated successfully.
Dec 05 06:13:58 compute-0 podman[208136]: 2025-12-05 06:13:58.480532565 +0000 UTC m=+0.052579703 container cleanup 5e16bc081e761a41b69e8b3d184f5e52f02d4cfb8610d076a7e09e6873c35a73 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, io.buildah.version=1.41.4)
Dec 05 06:13:58 compute-0 systemd[1]: libpod-conmon-5e16bc081e761a41b69e8b3d184f5e52f02d4cfb8610d076a7e09e6873c35a73.scope: Deactivated successfully.
Dec 05 06:13:58 compute-0 podman[208138]: 2025-12-05 06:13:58.495192804 +0000 UTC m=+0.058115653 container remove 5e16bc081e761a41b69e8b3d184f5e52f02d4cfb8610d076a7e09e6873c35a73 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.schema-version=1.0)
Dec 05 06:13:58 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:58.502 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[c3e5debf-4f9f-428d-a44d-c5d701552f40]: (4, ("Fri Dec  5 06:13:58 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c (5e16bc081e761a41b69e8b3d184f5e52f02d4cfb8610d076a7e09e6873c35a73)\n5e16bc081e761a41b69e8b3d184f5e52f02d4cfb8610d076a7e09e6873c35a73\nFri Dec  5 06:13:58 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c (5e16bc081e761a41b69e8b3d184f5e52f02d4cfb8610d076a7e09e6873c35a73)\n5e16bc081e761a41b69e8b3d184f5e52f02d4cfb8610d076a7e09e6873c35a73\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:13:58 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:58.504 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[04f67311-18dd-4476-a306-4a4aeebac144]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:13:58 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:58.504 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a2dd081e-dc20-4351-ac59-ccdb3568905c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a2dd081e-dc20-4351-ac59-ccdb3568905c.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:13:58 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:58.505 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[b40b7dc3-c236-4c2f-bf39-fd4cf081a19b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:13:58 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:58.505 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa2dd081e-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:13:58 compute-0 nova_compute[186329]: 2025-12-05 06:13:58.507 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:58 compute-0 kernel: tapa2dd081e-d0: left promiscuous mode
Dec 05 06:13:58 compute-0 nova_compute[186329]: 2025-12-05 06:13:58.520 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:58 compute-0 nova_compute[186329]: 2025-12-05 06:13:58.522 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:58 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:58.523 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[49e883ab-336f-42a2-9faf-b06a30abf489]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:13:58 compute-0 nova_compute[186329]: 2025-12-05 06:13:58.527 186333 INFO nova.virt.libvirt.driver [-] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Instance destroyed successfully.
Dec 05 06:13:58 compute-0 nova_compute[186329]: 2025-12-05 06:13:58.527 186333 DEBUG nova.objects.instance [None req-357d9bbc-b62e-4ceb-93c2-df3ae01ade29 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Lazy-loading 'resources' on Instance uuid 496a64d6-66b3-43ee-9e98-466ec3fd223d obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:13:58 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:58.535 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[d0670157-ec1d-4137-a1a0-8d499a8d0358]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:13:58 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:58.535 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[889ccc2c-8547-4323-9fd2-44226f77838e]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:13:58 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:58.549 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[f4f0d617-5cc6-4547-b1e8-c0f3aaf21135]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 309056, 'reachable_time': 26192, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 208182, 'error': None, 'target': 'ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:13:58 compute-0 systemd[1]: run-netns-ovnmeta\x2da2dd081e\x2ddc20\x2d4351\x2dac59\x2dccdb3568905c.mount: Deactivated successfully.
Dec 05 06:13:58 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:58.551 104153 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 05 06:13:58 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:13:58.551 104153 DEBUG oslo.privsep.daemon [-] privsep: reply[165002e0-00d4-4a3b-9712-c4448fad688b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:13:58 compute-0 nova_compute[186329]: 2025-12-05 06:13:58.599 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:58 compute-0 nova_compute[186329]: 2025-12-05 06:13:58.787 186333 WARNING neutronclient.v2_0.client [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:13:58 compute-0 nova_compute[186329]: 2025-12-05 06:13:58.977 186333 DEBUG nova.network.neutron [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Updating instance_info_cache with network_info: [{"id": "f2156bd9-041c-4b4a-8a98-32fa8b5b95a0", "address": "fa:16:3e:d4:b7:2f", "network": {"id": "a2dd081e-dc20-4351-ac59-ccdb3568905c", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1604933201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d726452471114afe8e8cd3a437713a5d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf2156bd9-04", "ovs_interfaceid": "f2156bd9-041c-4b4a-8a98-32fa8b5b95a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:13:59 compute-0 nova_compute[186329]: 2025-12-05 06:13:59.031 186333 DEBUG nova.virt.libvirt.vif [None req-357d9bbc-b62e-4ceb-93c2-df3ae01ade29 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-12-05T06:13:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-875708825',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-875708825',id=5,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T06:13:33Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5e3ec7864ec74a8e9a98ea7d30769fb0',ramdisk_id='',reservation_id='r-7knweki8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,member,admin,reader',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk
='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-2083413180',owner_user_name='tempest-TestExecuteActionsViaActuator-2083413180-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T06:13:33Z,user_data=None,user_id='e5df4001be694d0a80e0436f215d8a10',uuid=496a64d6-66b3-43ee-9e98-466ec3fd223d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7871a5b7-b713-4fab-810f-37a03f953665", "address": "fa:16:3e:0a:bd:1d", "network": {"id": "a2dd081e-dc20-4351-ac59-ccdb3568905c", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1604933201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d726452471114afe8e8cd3a437713a5d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7871a5b7-b7", "ovs_interfaceid": "7871a5b7-b713-4fab-810f-37a03f953665", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 05 06:13:59 compute-0 nova_compute[186329]: 2025-12-05 06:13:59.031 186333 DEBUG nova.network.os_vif_util [None req-357d9bbc-b62e-4ceb-93c2-df3ae01ade29 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Converting VIF {"id": "7871a5b7-b713-4fab-810f-37a03f953665", "address": "fa:16:3e:0a:bd:1d", "network": {"id": "a2dd081e-dc20-4351-ac59-ccdb3568905c", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1604933201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d726452471114afe8e8cd3a437713a5d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7871a5b7-b7", "ovs_interfaceid": "7871a5b7-b713-4fab-810f-37a03f953665", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:13:59 compute-0 nova_compute[186329]: 2025-12-05 06:13:59.031 186333 DEBUG nova.network.os_vif_util [None req-357d9bbc-b62e-4ceb-93c2-df3ae01ade29 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0a:bd:1d,bridge_name='br-int',has_traffic_filtering=True,id=7871a5b7-b713-4fab-810f-37a03f953665,network=Network(a2dd081e-dc20-4351-ac59-ccdb3568905c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7871a5b7-b7') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:13:59 compute-0 nova_compute[186329]: 2025-12-05 06:13:59.032 186333 DEBUG os_vif [None req-357d9bbc-b62e-4ceb-93c2-df3ae01ade29 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0a:bd:1d,bridge_name='br-int',has_traffic_filtering=True,id=7871a5b7-b713-4fab-810f-37a03f953665,network=Network(a2dd081e-dc20-4351-ac59-ccdb3568905c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7871a5b7-b7') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 05 06:13:59 compute-0 nova_compute[186329]: 2025-12-05 06:13:59.034 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:59 compute-0 nova_compute[186329]: 2025-12-05 06:13:59.034 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7871a5b7-b7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:13:59 compute-0 nova_compute[186329]: 2025-12-05 06:13:59.035 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:59 compute-0 nova_compute[186329]: 2025-12-05 06:13:59.037 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:59 compute-0 nova_compute[186329]: 2025-12-05 06:13:59.037 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:59 compute-0 nova_compute[186329]: 2025-12-05 06:13:59.037 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=81d6aafc-aaea-41c4-b2b9-fa68b6467dec) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:13:59 compute-0 nova_compute[186329]: 2025-12-05 06:13:59.041 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:13:59 compute-0 nova_compute[186329]: 2025-12-05 06:13:59.043 186333 INFO os_vif [None req-357d9bbc-b62e-4ceb-93c2-df3ae01ade29 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0a:bd:1d,bridge_name='br-int',has_traffic_filtering=True,id=7871a5b7-b713-4fab-810f-37a03f953665,network=Network(a2dd081e-dc20-4351-ac59-ccdb3568905c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7871a5b7-b7')
Dec 05 06:13:59 compute-0 nova_compute[186329]: 2025-12-05 06:13:59.044 186333 INFO nova.virt.libvirt.driver [None req-357d9bbc-b62e-4ceb-93c2-df3ae01ade29 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Deleting instance files /var/lib/nova/instances/496a64d6-66b3-43ee-9e98-466ec3fd223d_del
Dec 05 06:13:59 compute-0 nova_compute[186329]: 2025-12-05 06:13:59.045 186333 INFO nova.virt.libvirt.driver [None req-357d9bbc-b62e-4ceb-93c2-df3ae01ade29 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Deletion of /var/lib/nova/instances/496a64d6-66b3-43ee-9e98-466ec3fd223d_del complete
Dec 05 06:13:59 compute-0 nova_compute[186329]: 2025-12-05 06:13:59.482 186333 DEBUG oslo_concurrency.lockutils [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Releasing lock "refresh_cache-2f8d28d2-73a5-43e3-84d2-0e117d02d93a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:13:59 compute-0 nova_compute[186329]: 2025-12-05 06:13:59.485 186333 DEBUG oslo_concurrency.lockutils [req-ce5d3234-f7d2-40b7-bafa-6ebeb1d953b5 req-e3c06e5b-7743-460d-b740-8bb0be11f119 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquired lock "refresh_cache-2f8d28d2-73a5-43e3-84d2-0e117d02d93a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:13:59 compute-0 nova_compute[186329]: 2025-12-05 06:13:59.485 186333 DEBUG nova.network.neutron [req-ce5d3234-f7d2-40b7-bafa-6ebeb1d953b5 req-e3c06e5b-7743-460d-b740-8bb0be11f119 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Refreshing network info cache for port f2156bd9-041c-4b4a-8a98-32fa8b5b95a0 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 05 06:13:59 compute-0 nova_compute[186329]: 2025-12-05 06:13:59.557 186333 INFO nova.compute.manager [None req-357d9bbc-b62e-4ceb-93c2-df3ae01ade29 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Took 1.29 seconds to destroy the instance on the hypervisor.
Dec 05 06:13:59 compute-0 nova_compute[186329]: 2025-12-05 06:13:59.558 186333 DEBUG oslo.service.backend._eventlet.loopingcall [None req-357d9bbc-b62e-4ceb-93c2-df3ae01ade29 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 05 06:13:59 compute-0 nova_compute[186329]: 2025-12-05 06:13:59.558 186333 DEBUG nova.compute.manager [-] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 05 06:13:59 compute-0 nova_compute[186329]: 2025-12-05 06:13:59.558 186333 DEBUG nova.network.neutron [-] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 05 06:13:59 compute-0 nova_compute[186329]: 2025-12-05 06:13:59.558 186333 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:13:59 compute-0 nova_compute[186329]: 2025-12-05 06:13:59.729 186333 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:13:59 compute-0 podman[196599]: time="2025-12-05T06:13:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:13:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:13:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:13:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:13:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2578 "" "Go-http-client/1.1"
Dec 05 06:13:59 compute-0 nova_compute[186329]: 2025-12-05 06:13:59.772 186333 DEBUG nova.compute.manager [req-f658428d-adef-4478-96cb-17d0512d52de req-5e6e3ba8-0886-4208-92f7-fb95e7fcc3fd fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Received event network-vif-unplugged-7871a5b7-b713-4fab-810f-37a03f953665 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:13:59 compute-0 nova_compute[186329]: 2025-12-05 06:13:59.772 186333 DEBUG oslo_concurrency.lockutils [req-f658428d-adef-4478-96cb-17d0512d52de req-5e6e3ba8-0886-4208-92f7-fb95e7fcc3fd fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "496a64d6-66b3-43ee-9e98-466ec3fd223d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:13:59 compute-0 nova_compute[186329]: 2025-12-05 06:13:59.772 186333 DEBUG oslo_concurrency.lockutils [req-f658428d-adef-4478-96cb-17d0512d52de req-5e6e3ba8-0886-4208-92f7-fb95e7fcc3fd fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "496a64d6-66b3-43ee-9e98-466ec3fd223d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:13:59 compute-0 nova_compute[186329]: 2025-12-05 06:13:59.772 186333 DEBUG oslo_concurrency.lockutils [req-f658428d-adef-4478-96cb-17d0512d52de req-5e6e3ba8-0886-4208-92f7-fb95e7fcc3fd fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "496a64d6-66b3-43ee-9e98-466ec3fd223d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:13:59 compute-0 nova_compute[186329]: 2025-12-05 06:13:59.773 186333 DEBUG nova.compute.manager [req-f658428d-adef-4478-96cb-17d0512d52de req-5e6e3ba8-0886-4208-92f7-fb95e7fcc3fd fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] No waiting events found dispatching network-vif-unplugged-7871a5b7-b713-4fab-810f-37a03f953665 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:13:59 compute-0 nova_compute[186329]: 2025-12-05 06:13:59.773 186333 DEBUG nova.compute.manager [req-f658428d-adef-4478-96cb-17d0512d52de req-5e6e3ba8-0886-4208-92f7-fb95e7fcc3fd fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Received event network-vif-unplugged-7871a5b7-b713-4fab-810f-37a03f953665 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 05 06:13:59 compute-0 nova_compute[186329]: 2025-12-05 06:13:59.773 186333 DEBUG nova.compute.manager [req-f658428d-adef-4478-96cb-17d0512d52de req-5e6e3ba8-0886-4208-92f7-fb95e7fcc3fd fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Received event network-vif-unplugged-7871a5b7-b713-4fab-810f-37a03f953665 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:13:59 compute-0 nova_compute[186329]: 2025-12-05 06:13:59.773 186333 DEBUG oslo_concurrency.lockutils [req-f658428d-adef-4478-96cb-17d0512d52de req-5e6e3ba8-0886-4208-92f7-fb95e7fcc3fd fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "496a64d6-66b3-43ee-9e98-466ec3fd223d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:13:59 compute-0 nova_compute[186329]: 2025-12-05 06:13:59.773 186333 DEBUG oslo_concurrency.lockutils [req-f658428d-adef-4478-96cb-17d0512d52de req-5e6e3ba8-0886-4208-92f7-fb95e7fcc3fd fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "496a64d6-66b3-43ee-9e98-466ec3fd223d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:13:59 compute-0 nova_compute[186329]: 2025-12-05 06:13:59.773 186333 DEBUG oslo_concurrency.lockutils [req-f658428d-adef-4478-96cb-17d0512d52de req-5e6e3ba8-0886-4208-92f7-fb95e7fcc3fd fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "496a64d6-66b3-43ee-9e98-466ec3fd223d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:13:59 compute-0 nova_compute[186329]: 2025-12-05 06:13:59.774 186333 DEBUG nova.compute.manager [req-f658428d-adef-4478-96cb-17d0512d52de req-5e6e3ba8-0886-4208-92f7-fb95e7fcc3fd fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] No waiting events found dispatching network-vif-unplugged-7871a5b7-b713-4fab-810f-37a03f953665 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:13:59 compute-0 nova_compute[186329]: 2025-12-05 06:13:59.774 186333 DEBUG nova.compute.manager [req-f658428d-adef-4478-96cb-17d0512d52de req-5e6e3ba8-0886-4208-92f7-fb95e7fcc3fd fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Received event network-vif-unplugged-7871a5b7-b713-4fab-810f-37a03f953665 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 05 06:13:59 compute-0 nova_compute[186329]: 2025-12-05 06:13:59.991 186333 WARNING neutronclient.v2_0.client [req-ce5d3234-f7d2-40b7-bafa-6ebeb1d953b5 req-e3c06e5b-7743-460d-b740-8bb0be11f119 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.022 186333 DEBUG nova.virt.libvirt.driver [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Starting finish_migration finish_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12604
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.024 186333 DEBUG nova.virt.libvirt.driver [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Instance directory exists: not creating _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5134
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.024 186333 INFO nova.virt.libvirt.driver [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Creating image(s)
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.025 186333 DEBUG oslo_concurrency.processutils [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.069 186333 DEBUG oslo_concurrency.processutils [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.070 186333 DEBUG nova.virt.disk.api [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Checking if we can resize image /var/lib/nova/instances/2f8d28d2-73a5-43e3-84d2-0e117d02d93a/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.071 186333 DEBUG oslo_concurrency.processutils [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2f8d28d2-73a5-43e3-84d2-0e117d02d93a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.124 186333 DEBUG oslo_concurrency.processutils [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2f8d28d2-73a5-43e3-84d2-0e117d02d93a/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.124 186333 DEBUG nova.virt.disk.api [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Cannot resize image /var/lib/nova/instances/2f8d28d2-73a5-43e3-84d2-0e117d02d93a/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.270 186333 WARNING neutronclient.v2_0.client [req-ce5d3234-f7d2-40b7-bafa-6ebeb1d953b5 req-e3c06e5b-7743-460d-b740-8bb0be11f119 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.400 186333 DEBUG nova.network.neutron [req-ce5d3234-f7d2-40b7-bafa-6ebeb1d953b5 req-e3c06e5b-7743-460d-b740-8bb0be11f119 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Updated VIF entry in instance network info cache for port f2156bd9-041c-4b4a-8a98-32fa8b5b95a0. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.400 186333 DEBUG nova.network.neutron [req-ce5d3234-f7d2-40b7-bafa-6ebeb1d953b5 req-e3c06e5b-7743-460d-b740-8bb0be11f119 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Updating instance_info_cache with network_info: [{"id": "f2156bd9-041c-4b4a-8a98-32fa8b5b95a0", "address": "fa:16:3e:d4:b7:2f", "network": {"id": "a2dd081e-dc20-4351-ac59-ccdb3568905c", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1604933201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d726452471114afe8e8cd3a437713a5d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf2156bd9-04", "ovs_interfaceid": "f2156bd9-041c-4b4a-8a98-32fa8b5b95a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.434 186333 DEBUG nova.network.neutron [-] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.631 186333 DEBUG nova.virt.libvirt.driver [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Did not create local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5272
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.632 186333 DEBUG nova.virt.libvirt.driver [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Ensure instance console log exists: /var/lib/nova/instances/2f8d28d2-73a5-43e3-84d2-0e117d02d93a/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.632 186333 DEBUG oslo_concurrency.lockutils [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.632 186333 DEBUG oslo_concurrency.lockutils [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.633 186333 DEBUG oslo_concurrency.lockutils [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.634 186333 DEBUG nova.virt.libvirt.driver [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Start _get_guest_xml network_info=[{"id": "f2156bd9-041c-4b4a-8a98-32fa8b5b95a0", "address": "fa:16:3e:d4:b7:2f", "network": {"id": "a2dd081e-dc20-4351-ac59-ccdb3568905c", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1604933201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1604933201-network", "vif_mac": "fa:16:3e:d4:b7:2f"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d726452471114afe8e8cd3a437713a5d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf2156bd9-04", "ovs_interfaceid": "f2156bd9-041c-4b4a-8a98-32fa8b5b95a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T06:07:47Z,direct_url=<?>,disk_format='qcow2',id=6903ca06-7f44-4ad2-ab8b-0d16feef7d51,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fcef582be2274b9ba43451b49b4066ec',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T06:07:49Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'device_name': '/dev/vda', 'guest_format': None, 'image_id': '6903ca06-7f44-4ad2-ab8b-0d16feef7d51'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.637 186333 WARNING nova.virt.libvirt.driver [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.638 186333 DEBUG nova.virt.driver [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='6903ca06-7f44-4ad2-ab8b-0d16feef7d51', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-845294095', uuid='2f8d28d2-73a5-43e3-84d2-0e117d02d93a'), owner=OwnerMeta(userid='e5df4001be694d0a80e0436f215d8a10', username='tempest-TestExecuteActionsViaActuator-2083413180-project-admin', projectid='5e3ec7864ec74a8e9a98ea7d30769fb0', projectname='tempest-TestExecuteActionsViaActuator-2083413180'), image=ImageMeta(id='6903ca06-7f44-4ad2-ab8b-0d16feef7d51', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_cdrom_bus': 'sata', 'hw_disk_bus': 'virtio', 'hw_input_bus': 'usb', 'hw_machine_type': 'q35', 'hw_pointer_model': 'usbtablet', 'hw_rng_model': 'virtio', 'hw_video_model': 'virtio', 'hw_vif_model': 'virtio'}), flavor=FlavorMeta(name='m1.micro', flavorid='da5dfa03-1fbb-4783-97bb-d13dca9dd7f2', memory_mb=192, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "f2156bd9-041c-4b4a-8a98-32fa8b5b95a0", "address": "fa:16:3e:d4:b7:2f", "network": {"id": "a2dd081e-dc20-4351-ac59-ccdb3568905c", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1604933201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1604933201-network", "vif_mac": "fa:16:3e:d4:b7:2f"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d726452471114afe8e8cd3a437713a5d", "mtu": 
1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf2156bd9-04", "ovs_interfaceid": "f2156bd9-041c-4b4a-8a98-32fa8b5b95a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764915240.6385705) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.651 186333 DEBUG nova.virt.libvirt.host [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.651 186333 DEBUG nova.virt.libvirt.host [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.654 186333 DEBUG nova.virt.libvirt.host [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.654 186333 DEBUG nova.virt.libvirt.host [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.655 186333 DEBUG nova.virt.libvirt.driver [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.656 186333 DEBUG nova.virt.hardware [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T06:07:46Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='da5dfa03-1fbb-4783-97bb-d13dca9dd7f2',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T06:07:47Z,direct_url=<?>,disk_format='qcow2',id=6903ca06-7f44-4ad2-ab8b-0d16feef7d51,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fcef582be2274b9ba43451b49b4066ec',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T06:07:49Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.656 186333 DEBUG nova.virt.hardware [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.656 186333 DEBUG nova.virt.hardware [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.656 186333 DEBUG nova.virt.hardware [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.656 186333 DEBUG nova.virt.hardware [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.657 186333 DEBUG nova.virt.hardware [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.657 186333 DEBUG nova.virt.hardware [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.657 186333 DEBUG nova.virt.hardware [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.657 186333 DEBUG nova.virt.hardware [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.657 186333 DEBUG nova.virt.hardware [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.658 186333 DEBUG nova.virt.hardware [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.660 186333 DEBUG oslo_concurrency.processutils [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2f8d28d2-73a5-43e3-84d2-0e117d02d93a/disk.config --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.702 186333 DEBUG oslo_concurrency.processutils [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2f8d28d2-73a5-43e3-84d2-0e117d02d93a/disk.config --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.703 186333 DEBUG oslo_concurrency.lockutils [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "/var/lib/nova/instances/2f8d28d2-73a5-43e3-84d2-0e117d02d93a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.703 186333 DEBUG oslo_concurrency.lockutils [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "/var/lib/nova/instances/2f8d28d2-73a5-43e3-84d2-0e117d02d93a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.704 186333 DEBUG oslo_concurrency.lockutils [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "/var/lib/nova/instances/2f8d28d2-73a5-43e3-84d2-0e117d02d93a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.704 186333 DEBUG nova.virt.libvirt.vif [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-12-05T06:12:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-845294095',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-845294095',id=4,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T06:13:10Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5e3ec7864ec74a8e9a98ea7d30769fb0',ramdisk_id='',reservation_id='r-cb412s8d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='manager,member,admin,reader',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_
model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-2083413180',owner_user_name='tempest-TestExecuteActionsViaActuator-2083413180-project-admin'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T06:13:54Z,user_data=None,user_id='e5df4001be694d0a80e0436f215d8a10',uuid=2f8d28d2-73a5-43e3-84d2-0e117d02d93a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f2156bd9-041c-4b4a-8a98-32fa8b5b95a0", "address": "fa:16:3e:d4:b7:2f", "network": {"id": "a2dd081e-dc20-4351-ac59-ccdb3568905c", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1604933201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1604933201-network", "vif_mac": "fa:16:3e:d4:b7:2f"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d726452471114afe8e8cd3a437713a5d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf2156bd9-04", "ovs_interfaceid": "f2156bd9-041c-4b4a-8a98-32fa8b5b95a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.705 186333 DEBUG nova.network.os_vif_util [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Converting VIF {"id": "f2156bd9-041c-4b4a-8a98-32fa8b5b95a0", "address": "fa:16:3e:d4:b7:2f", "network": {"id": "a2dd081e-dc20-4351-ac59-ccdb3568905c", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1604933201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1604933201-network", "vif_mac": "fa:16:3e:d4:b7:2f"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d726452471114afe8e8cd3a437713a5d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf2156bd9-04", "ovs_interfaceid": "f2156bd9-041c-4b4a-8a98-32fa8b5b95a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.705 186333 DEBUG nova.network.os_vif_util [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d4:b7:2f,bridge_name='br-int',has_traffic_filtering=True,id=f2156bd9-041c-4b4a-8a98-32fa8b5b95a0,network=Network(a2dd081e-dc20-4351-ac59-ccdb3568905c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf2156bd9-04') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.707 186333 DEBUG nova.virt.libvirt.driver [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] End _get_guest_xml xml=<domain type="kvm">
Dec 05 06:14:00 compute-0 nova_compute[186329]:   <uuid>2f8d28d2-73a5-43e3-84d2-0e117d02d93a</uuid>
Dec 05 06:14:00 compute-0 nova_compute[186329]:   <name>instance-00000004</name>
Dec 05 06:14:00 compute-0 nova_compute[186329]:   <memory>196608</memory>
Dec 05 06:14:00 compute-0 nova_compute[186329]:   <vcpu>1</vcpu>
Dec 05 06:14:00 compute-0 nova_compute[186329]:   <metadata>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 06:14:00 compute-0 nova_compute[186329]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-845294095</nova:name>
Dec 05 06:14:00 compute-0 nova_compute[186329]:       <nova:creationTime>2025-12-05 06:14:00</nova:creationTime>
Dec 05 06:14:00 compute-0 nova_compute[186329]:       <nova:flavor name="m1.micro" id="da5dfa03-1fbb-4783-97bb-d13dca9dd7f2">
Dec 05 06:14:00 compute-0 nova_compute[186329]:         <nova:memory>192</nova:memory>
Dec 05 06:14:00 compute-0 nova_compute[186329]:         <nova:disk>1</nova:disk>
Dec 05 06:14:00 compute-0 nova_compute[186329]:         <nova:swap>0</nova:swap>
Dec 05 06:14:00 compute-0 nova_compute[186329]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 06:14:00 compute-0 nova_compute[186329]:         <nova:vcpus>1</nova:vcpus>
Dec 05 06:14:00 compute-0 nova_compute[186329]:         <nova:extraSpecs>
Dec 05 06:14:00 compute-0 nova_compute[186329]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 05 06:14:00 compute-0 nova_compute[186329]:         </nova:extraSpecs>
Dec 05 06:14:00 compute-0 nova_compute[186329]:       </nova:flavor>
Dec 05 06:14:00 compute-0 nova_compute[186329]:       <nova:image uuid="6903ca06-7f44-4ad2-ab8b-0d16feef7d51">
Dec 05 06:14:00 compute-0 nova_compute[186329]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 05 06:14:00 compute-0 nova_compute[186329]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 05 06:14:00 compute-0 nova_compute[186329]:         <nova:minDisk>1</nova:minDisk>
Dec 05 06:14:00 compute-0 nova_compute[186329]:         <nova:minRam>0</nova:minRam>
Dec 05 06:14:00 compute-0 nova_compute[186329]:         <nova:properties>
Dec 05 06:14:00 compute-0 nova_compute[186329]:           <nova:property name="hw_cdrom_bus">sata</nova:property>
Dec 05 06:14:00 compute-0 nova_compute[186329]:           <nova:property name="hw_disk_bus">virtio</nova:property>
Dec 05 06:14:00 compute-0 nova_compute[186329]:           <nova:property name="hw_input_bus">usb</nova:property>
Dec 05 06:14:00 compute-0 nova_compute[186329]:           <nova:property name="hw_machine_type">q35</nova:property>
Dec 05 06:14:00 compute-0 nova_compute[186329]:           <nova:property name="hw_pointer_model">usbtablet</nova:property>
Dec 05 06:14:00 compute-0 nova_compute[186329]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 05 06:14:00 compute-0 nova_compute[186329]:           <nova:property name="hw_video_model">virtio</nova:property>
Dec 05 06:14:00 compute-0 nova_compute[186329]:           <nova:property name="hw_vif_model">virtio</nova:property>
Dec 05 06:14:00 compute-0 nova_compute[186329]:         </nova:properties>
Dec 05 06:14:00 compute-0 nova_compute[186329]:       </nova:image>
Dec 05 06:14:00 compute-0 nova_compute[186329]:       <nova:owner>
Dec 05 06:14:00 compute-0 nova_compute[186329]:         <nova:user uuid="e5df4001be694d0a80e0436f215d8a10">tempest-TestExecuteActionsViaActuator-2083413180-project-admin</nova:user>
Dec 05 06:14:00 compute-0 nova_compute[186329]:         <nova:project uuid="5e3ec7864ec74a8e9a98ea7d30769fb0">tempest-TestExecuteActionsViaActuator-2083413180</nova:project>
Dec 05 06:14:00 compute-0 nova_compute[186329]:       </nova:owner>
Dec 05 06:14:00 compute-0 nova_compute[186329]:       <nova:root type="image" uuid="6903ca06-7f44-4ad2-ab8b-0d16feef7d51"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:       <nova:ports>
Dec 05 06:14:00 compute-0 nova_compute[186329]:         <nova:port uuid="f2156bd9-041c-4b4a-8a98-32fa8b5b95a0">
Dec 05 06:14:00 compute-0 nova_compute[186329]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:         </nova:port>
Dec 05 06:14:00 compute-0 nova_compute[186329]:       </nova:ports>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     </nova:instance>
Dec 05 06:14:00 compute-0 nova_compute[186329]:   </metadata>
Dec 05 06:14:00 compute-0 nova_compute[186329]:   <sysinfo type="smbios">
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <system>
Dec 05 06:14:00 compute-0 nova_compute[186329]:       <entry name="manufacturer">RDO</entry>
Dec 05 06:14:00 compute-0 nova_compute[186329]:       <entry name="product">OpenStack Compute</entry>
Dec 05 06:14:00 compute-0 nova_compute[186329]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 05 06:14:00 compute-0 nova_compute[186329]:       <entry name="serial">2f8d28d2-73a5-43e3-84d2-0e117d02d93a</entry>
Dec 05 06:14:00 compute-0 nova_compute[186329]:       <entry name="uuid">2f8d28d2-73a5-43e3-84d2-0e117d02d93a</entry>
Dec 05 06:14:00 compute-0 nova_compute[186329]:       <entry name="family">Virtual Machine</entry>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     </system>
Dec 05 06:14:00 compute-0 nova_compute[186329]:   </sysinfo>
Dec 05 06:14:00 compute-0 nova_compute[186329]:   <os>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <boot dev="hd"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <smbios mode="sysinfo"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:   </os>
Dec 05 06:14:00 compute-0 nova_compute[186329]:   <features>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <acpi/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <apic/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <vmcoreinfo/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:   </features>
Dec 05 06:14:00 compute-0 nova_compute[186329]:   <clock offset="utc">
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <timer name="hpet" present="no"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:   </clock>
Dec 05 06:14:00 compute-0 nova_compute[186329]:   <cpu mode="custom" match="exact">
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <model>Nehalem</model>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:   </cpu>
Dec 05 06:14:00 compute-0 nova_compute[186329]:   <devices>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <disk type="file" device="disk">
Dec 05 06:14:00 compute-0 nova_compute[186329]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:       <source file="/var/lib/nova/instances/2f8d28d2-73a5-43e3-84d2-0e117d02d93a/disk"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:       <target dev="vda" bus="virtio"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     </disk>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <disk type="file" device="cdrom">
Dec 05 06:14:00 compute-0 nova_compute[186329]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:       <source file="/var/lib/nova/instances/2f8d28d2-73a5-43e3-84d2-0e117d02d93a/disk.config"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:       <target dev="sda" bus="sata"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     </disk>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <interface type="ethernet">
Dec 05 06:14:00 compute-0 nova_compute[186329]:       <mac address="fa:16:3e:d4:b7:2f"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:       <model type="virtio"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:       <mtu size="1442"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:       <target dev="tapf2156bd9-04"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     </interface>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <serial type="pty">
Dec 05 06:14:00 compute-0 nova_compute[186329]:       <log file="/var/lib/nova/instances/2f8d28d2-73a5-43e3-84d2-0e117d02d93a/console.log" append="off"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     </serial>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <video>
Dec 05 06:14:00 compute-0 nova_compute[186329]:       <model type="virtio"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     </video>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <input type="tablet" bus="usb"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <rng model="virtio">
Dec 05 06:14:00 compute-0 nova_compute[186329]:       <backend model="random">/dev/urandom</backend>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     </rng>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <controller type="usb" index="0"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 05 06:14:00 compute-0 nova_compute[186329]:       <stats period="10"/>
Dec 05 06:14:00 compute-0 nova_compute[186329]:     </memballoon>
Dec 05 06:14:00 compute-0 nova_compute[186329]:   </devices>
Dec 05 06:14:00 compute-0 nova_compute[186329]: </domain>
Dec 05 06:14:00 compute-0 nova_compute[186329]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.709 186333 DEBUG nova.virt.libvirt.vif [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-12-05T06:12:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-845294095',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-845294095',id=4,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T06:13:10Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5e3ec7864ec74a8e9a98ea7d30769fb0',ramdisk_id='',reservation_id='r-cb412s8d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='manager,member,admin,reader',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-2083413180',owner_user_name='tempest-TestExecuteActionsViaActuator-2083413180-project-admin'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T06:13:54Z,user_data=None,user_id='e5df4001be694d0a80e0436f215d8a10',uuid=2f8d28d2-73a5-43e3-84d2-0e117d02d93a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f2156bd9-041c-4b4a-8a98-32fa8b5b95a0", "address": "fa:16:3e:d4:b7:2f", "network": {"id": "a2dd081e-dc20-4351-ac59-ccdb3568905c", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1604933201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1604933201-network", "vif_mac": "fa:16:3e:d4:b7:2f"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d726452471114afe8e8cd3a437713a5d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf2156bd9-04", "ovs_interfaceid": "f2156bd9-041c-4b4a-8a98-32fa8b5b95a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.710 186333 DEBUG nova.network.os_vif_util [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Converting VIF {"id": "f2156bd9-041c-4b4a-8a98-32fa8b5b95a0", "address": "fa:16:3e:d4:b7:2f", "network": {"id": "a2dd081e-dc20-4351-ac59-ccdb3568905c", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1604933201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1604933201-network", "vif_mac": "fa:16:3e:d4:b7:2f"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d726452471114afe8e8cd3a437713a5d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf2156bd9-04", "ovs_interfaceid": "f2156bd9-041c-4b4a-8a98-32fa8b5b95a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.710 186333 DEBUG nova.network.os_vif_util [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d4:b7:2f,bridge_name='br-int',has_traffic_filtering=True,id=f2156bd9-041c-4b4a-8a98-32fa8b5b95a0,network=Network(a2dd081e-dc20-4351-ac59-ccdb3568905c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf2156bd9-04') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.711 186333 DEBUG os_vif [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d4:b7:2f,bridge_name='br-int',has_traffic_filtering=True,id=f2156bd9-041c-4b4a-8a98-32fa8b5b95a0,network=Network(a2dd081e-dc20-4351-ac59-ccdb3568905c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf2156bd9-04') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.712 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.712 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.712 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.713 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.713 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '366022db-af6f-521b-8f38-a3c1818143d1', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.714 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.715 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.717 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.717 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf2156bd9-04, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.717 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapf2156bd9-04, col_values=(('qos', UUID('6b265fad-e25c-4b6e-93d1-2358fe928e23')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.718 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapf2156bd9-04, col_values=(('external_ids', {'iface-id': 'f2156bd9-041c-4b4a-8a98-32fa8b5b95a0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d4:b7:2f', 'vm-uuid': '2f8d28d2-73a5-43e3-84d2-0e117d02d93a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.719 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:00 compute-0 NetworkManager[55434]: <info>  [1764915240.7198] manager: (tapf2156bd9-04): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/29)
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.722 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.723 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.723 186333 INFO os_vif [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d4:b7:2f,bridge_name='br-int',has_traffic_filtering=True,id=f2156bd9-041c-4b4a-8a98-32fa8b5b95a0,network=Network(a2dd081e-dc20-4351-ac59-ccdb3568905c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf2156bd9-04')
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.905 186333 DEBUG oslo_concurrency.lockutils [req-ce5d3234-f7d2-40b7-bafa-6ebeb1d953b5 req-e3c06e5b-7743-460d-b740-8bb0be11f119 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Releasing lock "refresh_cache-2f8d28d2-73a5-43e3-84d2-0e117d02d93a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:14:00 compute-0 nova_compute[186329]: 2025-12-05 06:14:00.938 186333 INFO nova.compute.manager [-] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Took 1.38 seconds to deallocate network for instance.
Dec 05 06:14:01 compute-0 openstack_network_exporter[198686]: ERROR   06:14:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:14:01 compute-0 openstack_network_exporter[198686]: ERROR   06:14:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:14:01 compute-0 openstack_network_exporter[198686]: ERROR   06:14:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:14:01 compute-0 openstack_network_exporter[198686]: ERROR   06:14:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:14:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:14:01 compute-0 openstack_network_exporter[198686]: ERROR   06:14:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:14:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:14:01 compute-0 nova_compute[186329]: 2025-12-05 06:14:01.463 186333 DEBUG oslo_concurrency.lockutils [None req-357d9bbc-b62e-4ceb-93c2-df3ae01ade29 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:14:01 compute-0 nova_compute[186329]: 2025-12-05 06:14:01.464 186333 DEBUG oslo_concurrency.lockutils [None req-357d9bbc-b62e-4ceb-93c2-df3ae01ade29 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:14:01 compute-0 nova_compute[186329]: 2025-12-05 06:14:01.508 186333 DEBUG nova.compute.provider_tree [None req-357d9bbc-b62e-4ceb-93c2-df3ae01ade29 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:14:01 compute-0 nova_compute[186329]: 2025-12-05 06:14:01.815 186333 DEBUG nova.compute.manager [req-ea659b14-6289-46a4-b3c8-834f7b20cdc2 req-a45a5045-2830-46de-8733-d545767198f8 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 496a64d6-66b3-43ee-9e98-466ec3fd223d] Received event network-vif-deleted-7871a5b7-b713-4fab-810f-37a03f953665 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:14:02 compute-0 nova_compute[186329]: 2025-12-05 06:14:02.012 186333 DEBUG nova.scheduler.client.report [None req-357d9bbc-b62e-4ceb-93c2-df3ae01ade29 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:14:02 compute-0 nova_compute[186329]: 2025-12-05 06:14:02.257 186333 DEBUG nova.virt.libvirt.driver [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 05 06:14:02 compute-0 nova_compute[186329]: 2025-12-05 06:14:02.258 186333 DEBUG nova.virt.libvirt.driver [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 05 06:14:02 compute-0 nova_compute[186329]: 2025-12-05 06:14:02.258 186333 DEBUG nova.virt.libvirt.driver [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] No VIF found with MAC fa:16:3e:d4:b7:2f, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 05 06:14:02 compute-0 nova_compute[186329]: 2025-12-05 06:14:02.259 186333 INFO nova.virt.libvirt.driver [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Using config drive
Dec 05 06:14:02 compute-0 kernel: tapf2156bd9-04: entered promiscuous mode
Dec 05 06:14:02 compute-0 NetworkManager[55434]: <info>  [1764915242.3105] manager: (tapf2156bd9-04): new Tun device (/org/freedesktop/NetworkManager/Devices/30)
Dec 05 06:14:02 compute-0 ovn_controller[95223]: 2025-12-05T06:14:02Z|00057|binding|INFO|Claiming lport f2156bd9-041c-4b4a-8a98-32fa8b5b95a0 for this chassis.
Dec 05 06:14:02 compute-0 ovn_controller[95223]: 2025-12-05T06:14:02Z|00058|binding|INFO|f2156bd9-041c-4b4a-8a98-32fa8b5b95a0: Claiming fa:16:3e:d4:b7:2f 10.100.0.9
Dec 05 06:14:02 compute-0 nova_compute[186329]: 2025-12-05 06:14:02.316 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:02.318 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d4:b7:2f 10.100.0.9'], port_security=['fa:16:3e:d4:b7:2f 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '2f8d28d2-73a5-43e3-84d2-0e117d02d93a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a2dd081e-dc20-4351-ac59-ccdb3568905c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e3ec7864ec74a8e9a98ea7d30769fb0', 'neutron:revision_number': '9', 'neutron:security_group_ids': '4eeb11fa-86ff-4478-bf3d-491b18f116e0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=78329d2c-9d75-49ec-bc81-89ddda82582c, chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], logical_port=f2156bd9-041c-4b4a-8a98-32fa8b5b95a0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:02.319 104041 INFO neutron.agent.ovn.metadata.agent [-] Port f2156bd9-041c-4b4a-8a98-32fa8b5b95a0 in datapath a2dd081e-dc20-4351-ac59-ccdb3568905c bound to our chassis
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:02.320 104041 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a2dd081e-dc20-4351-ac59-ccdb3568905c
Dec 05 06:14:02 compute-0 ovn_controller[95223]: 2025-12-05T06:14:02Z|00059|binding|INFO|Setting lport f2156bd9-041c-4b4a-8a98-32fa8b5b95a0 up in Southbound
Dec 05 06:14:02 compute-0 ovn_controller[95223]: 2025-12-05T06:14:02Z|00060|binding|INFO|Setting lport f2156bd9-041c-4b4a-8a98-32fa8b5b95a0 ovn-installed in OVS
Dec 05 06:14:02 compute-0 nova_compute[186329]: 2025-12-05 06:14:02.329 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:02.336 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[367b77c4-c09a-4ff0-9c29-4d3f0894241f]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:02.337 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa2dd081e-d1 in ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 05 06:14:02 compute-0 nova_compute[186329]: 2025-12-05 06:14:02.336 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:02.339 206383 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa2dd081e-d0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:02.339 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[6efefbb9-94e2-4d43-adf5-5cc983a96bb9]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:02.340 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[3c2c1783-3a89-435c-af23-1a64757f375a]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:14:02 compute-0 systemd-udevd[208207]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:02.350 104153 DEBUG oslo.privsep.daemon [-] privsep: reply[08732f31-a1dd-439c-871e-86d7cae1de51]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:02.357 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[d9b9988e-1434-4193-b6ce-9c85f14de1cd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:14:02 compute-0 NetworkManager[55434]: <info>  [1764915242.3614] device (tapf2156bd9-04): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 06:14:02 compute-0 NetworkManager[55434]: <info>  [1764915242.3622] device (tapf2156bd9-04): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 06:14:02 compute-0 systemd-machined[152967]: New machine qemu-3-instance-00000004.
Dec 05 06:14:02 compute-0 systemd[1]: Started Virtual Machine qemu-3-instance-00000004.
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:02.387 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[1e3166c0-8726-49a8-adf7-a2fd1f366b26]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:02.391 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[9858d2f6-f2d8-4ce8-b0be-b5f9bcfd5f76]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:14:02 compute-0 NetworkManager[55434]: <info>  [1764915242.3918] manager: (tapa2dd081e-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/31)
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:02.421 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[fc693908-67f4-4f98-bdec-4ce9165cb635]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:02.424 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[17634e4b-855e-435e-a1d7-a6f0f879efc9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:14:02 compute-0 NetworkManager[55434]: <info>  [1764915242.4514] device (tapa2dd081e-d0): carrier: link connected
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:02.457 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[53283623-7800-47f5-b424-ffb23d8d0832]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:02.473 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[5680c636-dd02-4d08-a7aa-15b876764f0d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa2dd081e-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6b:b0:d1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 312098, 'reachable_time': 36157, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 208234, 'error': None, 'target': 'ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:02.488 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[d6fc40bf-5ff1-4ef9-982c-06663824ba91]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6b:b0d1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 312098, 'tstamp': 312098}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 208235, 'error': None, 'target': 'ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:02.505 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[a41f3eff-533a-415b-aed1-db0cfa7713d3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa2dd081e-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6b:b0:d1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 312098, 'reachable_time': 36157, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 208236, 'error': None, 'target': 'ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:14:02 compute-0 nova_compute[186329]: 2025-12-05 06:14:02.519 186333 DEBUG oslo_concurrency.lockutils [None req-357d9bbc-b62e-4ceb-93c2-df3ae01ade29 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.055s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:02.537 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[1ef5b128-00ce-4e3a-9845-5a269d50b2ba]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:14:02 compute-0 nova_compute[186329]: 2025-12-05 06:14:02.545 186333 INFO nova.scheduler.client.report [None req-357d9bbc-b62e-4ceb-93c2-df3ae01ade29 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Deleted allocations for instance 496a64d6-66b3-43ee-9e98-466ec3fd223d
Dec 05 06:14:02 compute-0 nova_compute[186329]: 2025-12-05 06:14:02.599 186333 DEBUG nova.compute.manager [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 05 06:14:02 compute-0 nova_compute[186329]: 2025-12-05 06:14:02.602 186333 INFO nova.virt.libvirt.driver [-] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Instance running successfully.
Dec 05 06:14:02 compute-0 virtqemud[186605]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Dec 05 06:14:02 compute-0 virtqemud[186605]: hostname: compute-0
Dec 05 06:14:02 compute-0 virtqemud[186605]: argument unsupported: QEMU guest agent is not configured
Dec 05 06:14:02 compute-0 nova_compute[186329]: 2025-12-05 06:14:02.603 186333 DEBUG nova.virt.libvirt.guest [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:200
Dec 05 06:14:02 compute-0 nova_compute[186329]: 2025-12-05 06:14:02.604 186333 DEBUG nova.virt.libvirt.driver [None req-b97a1698-b823-4bed-82d1-9235265a3f51 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] finish_migration finished successfully. finish_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12699
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:02.617 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[36570b03-ec92-4c31-b21a-092f1ed2da0c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:02.618 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa2dd081e-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:02.619 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:02.619 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa2dd081e-d0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:14:02 compute-0 kernel: tapa2dd081e-d0: entered promiscuous mode
Dec 05 06:14:02 compute-0 NetworkManager[55434]: <info>  [1764915242.6213] manager: (tapa2dd081e-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Dec 05 06:14:02 compute-0 nova_compute[186329]: 2025-12-05 06:14:02.620 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:02.625 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa2dd081e-d0, col_values=(('external_ids', {'iface-id': '476e02c7-6c32-4db2-b808-685677ca76e8'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:14:02 compute-0 nova_compute[186329]: 2025-12-05 06:14:02.626 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:02 compute-0 ovn_controller[95223]: 2025-12-05T06:14:02Z|00061|binding|INFO|Releasing lport 476e02c7-6c32-4db2-b808-685677ca76e8 from this chassis (sb_readonly=0)
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:02.628 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[d9df2183-c76d-487c-9ae6-2991e48ebed4]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:02.630 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a2dd081e-dc20-4351-ac59-ccdb3568905c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a2dd081e-dc20-4351-ac59-ccdb3568905c.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:02.630 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a2dd081e-dc20-4351-ac59-ccdb3568905c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a2dd081e-dc20-4351-ac59-ccdb3568905c.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:02.630 104041 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for a2dd081e-dc20-4351-ac59-ccdb3568905c disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:02.630 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a2dd081e-dc20-4351-ac59-ccdb3568905c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a2dd081e-dc20-4351-ac59-ccdb3568905c.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:02.630 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[8468f2ef-05da-43ad-8046-0eaf4d42ea7a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:02.630 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a2dd081e-dc20-4351-ac59-ccdb3568905c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a2dd081e-dc20-4351-ac59-ccdb3568905c.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:02.631 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[ffcb94df-8afc-4096-8701-d49695773eae]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:02.631 104041 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]: global
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]:     log         /dev/log local0 debug
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]:     log-tag     haproxy-metadata-proxy-a2dd081e-dc20-4351-ac59-ccdb3568905c
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]:     user        root
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]:     group       root
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]:     maxconn     1024
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]:     pidfile     /var/lib/neutron/external/pids/a2dd081e-dc20-4351-ac59-ccdb3568905c.pid.haproxy
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]:     daemon
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]: defaults
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]:     log global
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]:     mode http
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]:     option httplog
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]:     option dontlognull
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]:     option http-server-close
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]:     option forwardfor
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]:     retries                 3
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]:     timeout http-request    30s
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]:     timeout connect         30s
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]:     timeout client          32s
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]:     timeout server          32s
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]:     timeout http-keep-alive 30s
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]: listen listener
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]:     bind 169.254.169.254:80
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]:     
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]:     http-request add-header X-OVN-Network-ID a2dd081e-dc20-4351-ac59-ccdb3568905c
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 05 06:14:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:02.631 104041 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c', 'env', 'PROCESS_TAG=haproxy-a2dd081e-dc20-4351-ac59-ccdb3568905c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a2dd081e-dc20-4351-ac59-ccdb3568905c.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 05 06:14:02 compute-0 nova_compute[186329]: 2025-12-05 06:14:02.645 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:02 compute-0 podman[208271]: 2025-12-05 06:14:02.95869066 +0000 UTC m=+0.040428251 container create 04b397db08092ff8cc7ad89a625f858e52f234daa4641800f4b23d9623d7e6f4 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec)
Dec 05 06:14:02 compute-0 systemd[1]: Started libpod-conmon-04b397db08092ff8cc7ad89a625f858e52f234daa4641800f4b23d9623d7e6f4.scope.
Dec 05 06:14:03 compute-0 systemd[1]: Started libcrun container.
Dec 05 06:14:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7825de6413b6e8740f03babe3340636a22d25bd84dc5209645d2160f326cf32/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 06:14:03 compute-0 podman[208271]: 2025-12-05 06:14:03.025561356 +0000 UTC m=+0.107298947 container init 04b397db08092ff8cc7ad89a625f858e52f234daa4641800f4b23d9623d7e6f4 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4)
Dec 05 06:14:03 compute-0 podman[208271]: 2025-12-05 06:14:03.030278538 +0000 UTC m=+0.112016119 container start 04b397db08092ff8cc7ad89a625f858e52f234daa4641800f4b23d9623d7e6f4 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0)
Dec 05 06:14:03 compute-0 podman[208271]: 2025-12-05 06:14:02.938256451 +0000 UTC m=+0.019994053 image pull 773fbabd60bc1c8e2ad203301a187a593937fa42bcc751aba0971bd96baac0cb quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current
Dec 05 06:14:03 compute-0 neutron-haproxy-ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c[208284]: [NOTICE]   (208288) : New worker (208290) forked
Dec 05 06:14:03 compute-0 neutron-haproxy-ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c[208284]: [NOTICE]   (208288) : Loading success.
Dec 05 06:14:03 compute-0 nova_compute[186329]: 2025-12-05 06:14:03.572 186333 DEBUG oslo_concurrency.lockutils [None req-357d9bbc-b62e-4ceb-93c2-df3ae01ade29 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Lock "496a64d6-66b3-43ee-9e98-466ec3fd223d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.820s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:14:03 compute-0 nova_compute[186329]: 2025-12-05 06:14:03.603 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:03 compute-0 nova_compute[186329]: 2025-12-05 06:14:03.865 186333 DEBUG nova.compute.manager [req-60b2bacf-f356-43b5-95ee-520015ea6a26 req-fa9e130a-785e-4033-bf60-17412fe09c36 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Received event network-vif-plugged-f2156bd9-041c-4b4a-8a98-32fa8b5b95a0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:14:03 compute-0 nova_compute[186329]: 2025-12-05 06:14:03.865 186333 DEBUG oslo_concurrency.lockutils [req-60b2bacf-f356-43b5-95ee-520015ea6a26 req-fa9e130a-785e-4033-bf60-17412fe09c36 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "2f8d28d2-73a5-43e3-84d2-0e117d02d93a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:14:03 compute-0 nova_compute[186329]: 2025-12-05 06:14:03.865 186333 DEBUG oslo_concurrency.lockutils [req-60b2bacf-f356-43b5-95ee-520015ea6a26 req-fa9e130a-785e-4033-bf60-17412fe09c36 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "2f8d28d2-73a5-43e3-84d2-0e117d02d93a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:14:03 compute-0 nova_compute[186329]: 2025-12-05 06:14:03.866 186333 DEBUG oslo_concurrency.lockutils [req-60b2bacf-f356-43b5-95ee-520015ea6a26 req-fa9e130a-785e-4033-bf60-17412fe09c36 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "2f8d28d2-73a5-43e3-84d2-0e117d02d93a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:14:03 compute-0 nova_compute[186329]: 2025-12-05 06:14:03.866 186333 DEBUG nova.compute.manager [req-60b2bacf-f356-43b5-95ee-520015ea6a26 req-fa9e130a-785e-4033-bf60-17412fe09c36 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] No waiting events found dispatching network-vif-plugged-f2156bd9-041c-4b4a-8a98-32fa8b5b95a0 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:14:03 compute-0 nova_compute[186329]: 2025-12-05 06:14:03.866 186333 WARNING nova.compute.manager [req-60b2bacf-f356-43b5-95ee-520015ea6a26 req-fa9e130a-785e-4033-bf60-17412fe09c36 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Received unexpected event network-vif-plugged-f2156bd9-041c-4b4a-8a98-32fa8b5b95a0 for instance with vm_state resized and task_state deleting.
Dec 05 06:14:03 compute-0 nova_compute[186329]: 2025-12-05 06:14:03.866 186333 DEBUG nova.compute.manager [req-60b2bacf-f356-43b5-95ee-520015ea6a26 req-fa9e130a-785e-4033-bf60-17412fe09c36 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Received event network-vif-plugged-f2156bd9-041c-4b4a-8a98-32fa8b5b95a0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:14:03 compute-0 nova_compute[186329]: 2025-12-05 06:14:03.866 186333 DEBUG oslo_concurrency.lockutils [req-60b2bacf-f356-43b5-95ee-520015ea6a26 req-fa9e130a-785e-4033-bf60-17412fe09c36 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "2f8d28d2-73a5-43e3-84d2-0e117d02d93a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:14:03 compute-0 nova_compute[186329]: 2025-12-05 06:14:03.866 186333 DEBUG oslo_concurrency.lockutils [req-60b2bacf-f356-43b5-95ee-520015ea6a26 req-fa9e130a-785e-4033-bf60-17412fe09c36 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "2f8d28d2-73a5-43e3-84d2-0e117d02d93a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:14:03 compute-0 nova_compute[186329]: 2025-12-05 06:14:03.867 186333 DEBUG oslo_concurrency.lockutils [req-60b2bacf-f356-43b5-95ee-520015ea6a26 req-fa9e130a-785e-4033-bf60-17412fe09c36 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "2f8d28d2-73a5-43e3-84d2-0e117d02d93a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:14:03 compute-0 nova_compute[186329]: 2025-12-05 06:14:03.867 186333 DEBUG nova.compute.manager [req-60b2bacf-f356-43b5-95ee-520015ea6a26 req-fa9e130a-785e-4033-bf60-17412fe09c36 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] No waiting events found dispatching network-vif-plugged-f2156bd9-041c-4b4a-8a98-32fa8b5b95a0 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:14:03 compute-0 nova_compute[186329]: 2025-12-05 06:14:03.867 186333 WARNING nova.compute.manager [req-60b2bacf-f356-43b5-95ee-520015ea6a26 req-fa9e130a-785e-4033-bf60-17412fe09c36 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Received unexpected event network-vif-plugged-f2156bd9-041c-4b4a-8a98-32fa8b5b95a0 for instance with vm_state resized and task_state deleting.
Dec 05 06:14:04 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Dec 05 06:14:04 compute-0 systemd[208065]: Activating special unit Exit the Session...
Dec 05 06:14:04 compute-0 systemd[208065]: Stopped target Main User Target.
Dec 05 06:14:04 compute-0 systemd[208065]: Stopped target Basic System.
Dec 05 06:14:04 compute-0 systemd[208065]: Stopped target Paths.
Dec 05 06:14:04 compute-0 systemd[208065]: Stopped target Sockets.
Dec 05 06:14:04 compute-0 systemd[208065]: Stopped target Timers.
Dec 05 06:14:04 compute-0 systemd[208065]: Stopped Mark boot as successful after the user session has run 2 minutes.
Dec 05 06:14:04 compute-0 systemd[208065]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 05 06:14:04 compute-0 systemd[208065]: Closed D-Bus User Message Bus Socket.
Dec 05 06:14:04 compute-0 systemd[208065]: Stopped Create User's Volatile Files and Directories.
Dec 05 06:14:04 compute-0 systemd[208065]: Removed slice User Application Slice.
Dec 05 06:14:04 compute-0 systemd[208065]: Reached target Shutdown.
Dec 05 06:14:04 compute-0 systemd[208065]: Finished Exit the Session.
Dec 05 06:14:04 compute-0 systemd[208065]: Reached target Exit the Session.
Dec 05 06:14:04 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Dec 05 06:14:04 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Dec 05 06:14:04 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Dec 05 06:14:04 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Dec 05 06:14:04 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Dec 05 06:14:04 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Dec 05 06:14:04 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Dec 05 06:14:05 compute-0 nova_compute[186329]: 2025-12-05 06:14:05.720 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:08 compute-0 nova_compute[186329]: 2025-12-05 06:14:08.609 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:10 compute-0 podman[208298]: 2025-12-05 06:14:10.480643381 +0000 UTC m=+0.060135002 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 05 06:14:10 compute-0 podman[208297]: 2025-12-05 06:14:10.537225549 +0000 UTC m=+0.114943815 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 05 06:14:10 compute-0 nova_compute[186329]: 2025-12-05 06:14:10.723 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:12 compute-0 nova_compute[186329]: 2025-12-05 06:14:12.274 186333 DEBUG oslo_concurrency.lockutils [None req-319e7dcb-7d8e-4471-abb6-77086d474062 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Acquiring lock "2f8d28d2-73a5-43e3-84d2-0e117d02d93a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:14:12 compute-0 nova_compute[186329]: 2025-12-05 06:14:12.275 186333 DEBUG oslo_concurrency.lockutils [None req-319e7dcb-7d8e-4471-abb6-77086d474062 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Lock "2f8d28d2-73a5-43e3-84d2-0e117d02d93a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:14:12 compute-0 nova_compute[186329]: 2025-12-05 06:14:12.275 186333 DEBUG oslo_concurrency.lockutils [None req-319e7dcb-7d8e-4471-abb6-77086d474062 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Acquiring lock "2f8d28d2-73a5-43e3-84d2-0e117d02d93a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:14:12 compute-0 nova_compute[186329]: 2025-12-05 06:14:12.276 186333 DEBUG oslo_concurrency.lockutils [None req-319e7dcb-7d8e-4471-abb6-77086d474062 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Lock "2f8d28d2-73a5-43e3-84d2-0e117d02d93a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:14:12 compute-0 nova_compute[186329]: 2025-12-05 06:14:12.276 186333 DEBUG oslo_concurrency.lockutils [None req-319e7dcb-7d8e-4471-abb6-77086d474062 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Lock "2f8d28d2-73a5-43e3-84d2-0e117d02d93a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:14:12 compute-0 nova_compute[186329]: 2025-12-05 06:14:12.285 186333 INFO nova.compute.manager [None req-319e7dcb-7d8e-4471-abb6-77086d474062 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Terminating instance
Dec 05 06:14:12 compute-0 nova_compute[186329]: 2025-12-05 06:14:12.793 186333 DEBUG nova.compute.manager [None req-319e7dcb-7d8e-4471-abb6-77086d474062 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 05 06:14:12 compute-0 kernel: tapf2156bd9-04 (unregistering): left promiscuous mode
Dec 05 06:14:12 compute-0 NetworkManager[55434]: <info>  [1764915252.8168] device (tapf2156bd9-04): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 06:14:12 compute-0 nova_compute[186329]: 2025-12-05 06:14:12.825 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:12 compute-0 ovn_controller[95223]: 2025-12-05T06:14:12Z|00062|binding|INFO|Releasing lport f2156bd9-041c-4b4a-8a98-32fa8b5b95a0 from this chassis (sb_readonly=0)
Dec 05 06:14:12 compute-0 ovn_controller[95223]: 2025-12-05T06:14:12Z|00063|binding|INFO|Setting lport f2156bd9-041c-4b4a-8a98-32fa8b5b95a0 down in Southbound
Dec 05 06:14:12 compute-0 ovn_controller[95223]: 2025-12-05T06:14:12Z|00064|binding|INFO|Removing iface tapf2156bd9-04 ovn-installed in OVS
Dec 05 06:14:12 compute-0 nova_compute[186329]: 2025-12-05 06:14:12.831 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:12 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:12.834 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d4:b7:2f 10.100.0.9'], port_security=['fa:16:3e:d4:b7:2f 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '2f8d28d2-73a5-43e3-84d2-0e117d02d93a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a2dd081e-dc20-4351-ac59-ccdb3568905c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5e3ec7864ec74a8e9a98ea7d30769fb0', 'neutron:revision_number': '10', 'neutron:security_group_ids': '4eeb11fa-86ff-4478-bf3d-491b18f116e0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=78329d2c-9d75-49ec-bc81-89ddda82582c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], logical_port=f2156bd9-041c-4b4a-8a98-32fa8b5b95a0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:14:12 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:12.834 104041 INFO neutron.agent.ovn.metadata.agent [-] Port f2156bd9-041c-4b4a-8a98-32fa8b5b95a0 in datapath a2dd081e-dc20-4351-ac59-ccdb3568905c unbound from our chassis
Dec 05 06:14:12 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:12.835 104041 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a2dd081e-dc20-4351-ac59-ccdb3568905c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 05 06:14:12 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:12.839 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[e9006f76-8efb-4327-84bd-85971002419a]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:14:12 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:12.840 104041 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c namespace which is not needed anymore
Dec 05 06:14:12 compute-0 nova_compute[186329]: 2025-12-05 06:14:12.842 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:12 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000004.scope: Deactivated successfully.
Dec 05 06:14:12 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000004.scope: Consumed 10.158s CPU time.
Dec 05 06:14:12 compute-0 systemd-machined[152967]: Machine qemu-3-instance-00000004 terminated.
Dec 05 06:14:12 compute-0 neutron-haproxy-ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c[208284]: [NOTICE]   (208288) : haproxy version is 3.0.5-8e879a5
Dec 05 06:14:12 compute-0 neutron-haproxy-ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c[208284]: [NOTICE]   (208288) : path to executable is /usr/sbin/haproxy
Dec 05 06:14:12 compute-0 neutron-haproxy-ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c[208284]: [WARNING]  (208288) : Exiting Master process...
Dec 05 06:14:12 compute-0 podman[208370]: 2025-12-05 06:14:12.941894935 +0000 UTC m=+0.026728507 container kill 04b397db08092ff8cc7ad89a625f858e52f234daa4641800f4b23d9623d7e6f4 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Dec 05 06:14:12 compute-0 neutron-haproxy-ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c[208284]: [ALERT]    (208288) : Current worker (208290) exited with code 143 (Terminated)
Dec 05 06:14:12 compute-0 neutron-haproxy-ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c[208284]: [WARNING]  (208288) : All workers exited. Exiting... (0)
Dec 05 06:14:12 compute-0 systemd[1]: libpod-04b397db08092ff8cc7ad89a625f858e52f234daa4641800f4b23d9623d7e6f4.scope: Deactivated successfully.
Dec 05 06:14:12 compute-0 nova_compute[186329]: 2025-12-05 06:14:12.969 186333 DEBUG nova.compute.manager [req-d8f6b64c-9a8b-4eca-959d-4a8852c38e02 req-8f11e432-71f8-4fe3-b5e4-e5bf5df224e2 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Received event network-vif-unplugged-f2156bd9-041c-4b4a-8a98-32fa8b5b95a0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:14:12 compute-0 nova_compute[186329]: 2025-12-05 06:14:12.969 186333 DEBUG oslo_concurrency.lockutils [req-d8f6b64c-9a8b-4eca-959d-4a8852c38e02 req-8f11e432-71f8-4fe3-b5e4-e5bf5df224e2 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "2f8d28d2-73a5-43e3-84d2-0e117d02d93a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:14:12 compute-0 nova_compute[186329]: 2025-12-05 06:14:12.970 186333 DEBUG oslo_concurrency.lockutils [req-d8f6b64c-9a8b-4eca-959d-4a8852c38e02 req-8f11e432-71f8-4fe3-b5e4-e5bf5df224e2 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "2f8d28d2-73a5-43e3-84d2-0e117d02d93a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:14:12 compute-0 nova_compute[186329]: 2025-12-05 06:14:12.970 186333 DEBUG oslo_concurrency.lockutils [req-d8f6b64c-9a8b-4eca-959d-4a8852c38e02 req-8f11e432-71f8-4fe3-b5e4-e5bf5df224e2 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "2f8d28d2-73a5-43e3-84d2-0e117d02d93a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:14:12 compute-0 nova_compute[186329]: 2025-12-05 06:14:12.970 186333 DEBUG nova.compute.manager [req-d8f6b64c-9a8b-4eca-959d-4a8852c38e02 req-8f11e432-71f8-4fe3-b5e4-e5bf5df224e2 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] No waiting events found dispatching network-vif-unplugged-f2156bd9-041c-4b4a-8a98-32fa8b5b95a0 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:14:12 compute-0 nova_compute[186329]: 2025-12-05 06:14:12.970 186333 WARNING nova.compute.manager [req-d8f6b64c-9a8b-4eca-959d-4a8852c38e02 req-8f11e432-71f8-4fe3-b5e4-e5bf5df224e2 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Received unexpected event network-vif-unplugged-f2156bd9-041c-4b4a-8a98-32fa8b5b95a0 for instance with vm_state active and task_state None.
Dec 05 06:14:12 compute-0 podman[208383]: 2025-12-05 06:14:12.988722999 +0000 UTC m=+0.026346458 container died 04b397db08092ff8cc7ad89a625f858e52f234daa4641800f4b23d9623d7e6f4 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Dec 05 06:14:13 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-04b397db08092ff8cc7ad89a625f858e52f234daa4641800f4b23d9623d7e6f4-userdata-shm.mount: Deactivated successfully.
Dec 05 06:14:13 compute-0 systemd[1]: var-lib-containers-storage-overlay-d7825de6413b6e8740f03babe3340636a22d25bd84dc5209645d2160f326cf32-merged.mount: Deactivated successfully.
Dec 05 06:14:13 compute-0 nova_compute[186329]: 2025-12-05 06:14:13.013 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:13 compute-0 nova_compute[186329]: 2025-12-05 06:14:13.016 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:13 compute-0 podman[208383]: 2025-12-05 06:14:13.021692832 +0000 UTC m=+0.059316271 container cleanup 04b397db08092ff8cc7ad89a625f858e52f234daa4641800f4b23d9623d7e6f4 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec)
Dec 05 06:14:13 compute-0 systemd[1]: libpod-conmon-04b397db08092ff8cc7ad89a625f858e52f234daa4641800f4b23d9623d7e6f4.scope: Deactivated successfully.
Dec 05 06:14:13 compute-0 podman[208384]: 2025-12-05 06:14:13.04133408 +0000 UTC m=+0.076648516 container remove 04b397db08092ff8cc7ad89a625f858e52f234daa4641800f4b23d9623d7e6f4 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 05 06:14:13 compute-0 nova_compute[186329]: 2025-12-05 06:14:13.045 186333 INFO nova.virt.libvirt.driver [-] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Instance destroyed successfully.
Dec 05 06:14:13 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:13.046 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[585e2238-92e6-4760-aa9d-ff7bfbd1fc74]: (4, ("Fri Dec  5 06:14:12 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c (04b397db08092ff8cc7ad89a625f858e52f234daa4641800f4b23d9623d7e6f4)\n04b397db08092ff8cc7ad89a625f858e52f234daa4641800f4b23d9623d7e6f4\nFri Dec  5 06:14:12 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c (04b397db08092ff8cc7ad89a625f858e52f234daa4641800f4b23d9623d7e6f4)\n04b397db08092ff8cc7ad89a625f858e52f234daa4641800f4b23d9623d7e6f4\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:14:13 compute-0 nova_compute[186329]: 2025-12-05 06:14:13.046 186333 DEBUG nova.objects.instance [None req-319e7dcb-7d8e-4471-abb6-77086d474062 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Lazy-loading 'resources' on Instance uuid 2f8d28d2-73a5-43e3-84d2-0e117d02d93a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:14:13 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:13.048 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[571f8a1f-f371-42df-80cc-a0e4d7d680db]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:14:13 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:13.048 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a2dd081e-dc20-4351-ac59-ccdb3568905c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a2dd081e-dc20-4351-ac59-ccdb3568905c.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:14:13 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:13.049 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[66b967af-7fad-4bec-855a-0e79f6f4a410]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:14:13 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:13.049 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa2dd081e-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:14:13 compute-0 nova_compute[186329]: 2025-12-05 06:14:13.050 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:13 compute-0 kernel: tapa2dd081e-d0: left promiscuous mode
Dec 05 06:14:13 compute-0 nova_compute[186329]: 2025-12-05 06:14:13.063 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:13 compute-0 nova_compute[186329]: 2025-12-05 06:14:13.068 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:13 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:13.069 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[e49ab73b-88a0-41a6-86d9-55c0baeda348]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:14:13 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:13.077 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[52905d0b-4301-47be-b090-73550e2c375c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:14:13 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:13.077 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[432b1d2d-c323-4c41-9e1d-a3b7b3159a6a]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:14:13 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:13.090 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[ea078b52-c3be-4d88-a6b2-780ca62c2e0f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 312091, 'reachable_time': 40536, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 208424, 'error': None, 'target': 'ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:14:13 compute-0 systemd[1]: run-netns-ovnmeta\x2da2dd081e\x2ddc20\x2d4351\x2dac59\x2dccdb3568905c.mount: Deactivated successfully.
Dec 05 06:14:13 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:13.093 104153 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a2dd081e-dc20-4351-ac59-ccdb3568905c deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 05 06:14:13 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:13.094 104153 DEBUG oslo.privsep.daemon [-] privsep: reply[fcdedf81-5804-48c7-ac8d-547467c5a19c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:14:13 compute-0 nova_compute[186329]: 2025-12-05 06:14:13.552 186333 DEBUG nova.virt.libvirt.vif [None req-319e7dcb-7d8e-4471-abb6-77086d474062 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-12-05T06:12:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-845294095',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-845294095',id=4,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T06:14:03Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5e3ec7864ec74a8e9a98ea7d30769fb0',ramdisk_id='',reservation_id='r-cb412s8d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,member,admin,reader',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',imag
e_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-2083413180',owner_user_name='tempest-TestExecuteActionsViaActuator-2083413180-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T06:14:03Z,user_data=None,user_id='e5df4001be694d0a80e0436f215d8a10',uuid=2f8d28d2-73a5-43e3-84d2-0e117d02d93a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "f2156bd9-041c-4b4a-8a98-32fa8b5b95a0", "address": "fa:16:3e:d4:b7:2f", "network": {"id": "a2dd081e-dc20-4351-ac59-ccdb3568905c", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1604933201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d726452471114afe8e8cd3a437713a5d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf2156bd9-04", "ovs_interfaceid": "f2156bd9-041c-4b4a-8a98-32fa8b5b95a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 05 06:14:13 compute-0 nova_compute[186329]: 2025-12-05 06:14:13.552 186333 DEBUG nova.network.os_vif_util [None req-319e7dcb-7d8e-4471-abb6-77086d474062 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Converting VIF {"id": "f2156bd9-041c-4b4a-8a98-32fa8b5b95a0", "address": "fa:16:3e:d4:b7:2f", "network": {"id": "a2dd081e-dc20-4351-ac59-ccdb3568905c", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1604933201-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d726452471114afe8e8cd3a437713a5d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf2156bd9-04", "ovs_interfaceid": "f2156bd9-041c-4b4a-8a98-32fa8b5b95a0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:14:13 compute-0 nova_compute[186329]: 2025-12-05 06:14:13.552 186333 DEBUG nova.network.os_vif_util [None req-319e7dcb-7d8e-4471-abb6-77086d474062 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d4:b7:2f,bridge_name='br-int',has_traffic_filtering=True,id=f2156bd9-041c-4b4a-8a98-32fa8b5b95a0,network=Network(a2dd081e-dc20-4351-ac59-ccdb3568905c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf2156bd9-04') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:14:13 compute-0 nova_compute[186329]: 2025-12-05 06:14:13.553 186333 DEBUG os_vif [None req-319e7dcb-7d8e-4471-abb6-77086d474062 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d4:b7:2f,bridge_name='br-int',has_traffic_filtering=True,id=f2156bd9-041c-4b4a-8a98-32fa8b5b95a0,network=Network(a2dd081e-dc20-4351-ac59-ccdb3568905c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf2156bd9-04') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 05 06:14:13 compute-0 nova_compute[186329]: 2025-12-05 06:14:13.554 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:13 compute-0 nova_compute[186329]: 2025-12-05 06:14:13.555 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf2156bd9-04, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:14:13 compute-0 nova_compute[186329]: 2025-12-05 06:14:13.556 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:13 compute-0 nova_compute[186329]: 2025-12-05 06:14:13.557 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:13 compute-0 nova_compute[186329]: 2025-12-05 06:14:13.558 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:13 compute-0 nova_compute[186329]: 2025-12-05 06:14:13.558 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=6b265fad-e25c-4b6e-93d1-2358fe928e23) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:14:13 compute-0 nova_compute[186329]: 2025-12-05 06:14:13.558 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:13 compute-0 nova_compute[186329]: 2025-12-05 06:14:13.559 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:13 compute-0 nova_compute[186329]: 2025-12-05 06:14:13.561 186333 INFO os_vif [None req-319e7dcb-7d8e-4471-abb6-77086d474062 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d4:b7:2f,bridge_name='br-int',has_traffic_filtering=True,id=f2156bd9-041c-4b4a-8a98-32fa8b5b95a0,network=Network(a2dd081e-dc20-4351-ac59-ccdb3568905c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf2156bd9-04')
Dec 05 06:14:13 compute-0 nova_compute[186329]: 2025-12-05 06:14:13.561 186333 INFO nova.virt.libvirt.driver [None req-319e7dcb-7d8e-4471-abb6-77086d474062 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Deleting instance files /var/lib/nova/instances/2f8d28d2-73a5-43e3-84d2-0e117d02d93a_del
Dec 05 06:14:13 compute-0 nova_compute[186329]: 2025-12-05 06:14:13.565 186333 INFO nova.virt.libvirt.driver [None req-319e7dcb-7d8e-4471-abb6-77086d474062 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Deletion of /var/lib/nova/instances/2f8d28d2-73a5-43e3-84d2-0e117d02d93a_del complete
Dec 05 06:14:13 compute-0 nova_compute[186329]: 2025-12-05 06:14:13.610 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:14 compute-0 nova_compute[186329]: 2025-12-05 06:14:14.074 186333 INFO nova.compute.manager [None req-319e7dcb-7d8e-4471-abb6-77086d474062 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Took 1.28 seconds to destroy the instance on the hypervisor.
Dec 05 06:14:14 compute-0 nova_compute[186329]: 2025-12-05 06:14:14.074 186333 DEBUG oslo.service.backend._eventlet.loopingcall [None req-319e7dcb-7d8e-4471-abb6-77086d474062 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 05 06:14:14 compute-0 nova_compute[186329]: 2025-12-05 06:14:14.075 186333 DEBUG nova.compute.manager [-] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 05 06:14:14 compute-0 nova_compute[186329]: 2025-12-05 06:14:14.075 186333 DEBUG nova.network.neutron [-] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 05 06:14:14 compute-0 nova_compute[186329]: 2025-12-05 06:14:14.075 186333 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:14:14 compute-0 nova_compute[186329]: 2025-12-05 06:14:14.442 186333 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:14:15 compute-0 nova_compute[186329]: 2025-12-05 06:14:15.072 186333 DEBUG nova.compute.manager [req-f54612ad-430e-4537-879c-a01f1df987b5 req-27a2424d-38ea-48a9-8c1e-fbd15d5ae56c fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Received event network-vif-unplugged-f2156bd9-041c-4b4a-8a98-32fa8b5b95a0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:14:15 compute-0 nova_compute[186329]: 2025-12-05 06:14:15.073 186333 DEBUG oslo_concurrency.lockutils [req-f54612ad-430e-4537-879c-a01f1df987b5 req-27a2424d-38ea-48a9-8c1e-fbd15d5ae56c fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "2f8d28d2-73a5-43e3-84d2-0e117d02d93a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:14:15 compute-0 nova_compute[186329]: 2025-12-05 06:14:15.073 186333 DEBUG oslo_concurrency.lockutils [req-f54612ad-430e-4537-879c-a01f1df987b5 req-27a2424d-38ea-48a9-8c1e-fbd15d5ae56c fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "2f8d28d2-73a5-43e3-84d2-0e117d02d93a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:14:15 compute-0 nova_compute[186329]: 2025-12-05 06:14:15.073 186333 DEBUG oslo_concurrency.lockutils [req-f54612ad-430e-4537-879c-a01f1df987b5 req-27a2424d-38ea-48a9-8c1e-fbd15d5ae56c fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "2f8d28d2-73a5-43e3-84d2-0e117d02d93a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:14:15 compute-0 nova_compute[186329]: 2025-12-05 06:14:15.073 186333 DEBUG nova.compute.manager [req-f54612ad-430e-4537-879c-a01f1df987b5 req-27a2424d-38ea-48a9-8c1e-fbd15d5ae56c fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] No waiting events found dispatching network-vif-unplugged-f2156bd9-041c-4b4a-8a98-32fa8b5b95a0 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:14:15 compute-0 nova_compute[186329]: 2025-12-05 06:14:15.073 186333 WARNING nova.compute.manager [req-f54612ad-430e-4537-879c-a01f1df987b5 req-27a2424d-38ea-48a9-8c1e-fbd15d5ae56c fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Received unexpected event network-vif-unplugged-f2156bd9-041c-4b4a-8a98-32fa8b5b95a0 for instance with vm_state active and task_state None.
Dec 05 06:14:15 compute-0 nova_compute[186329]: 2025-12-05 06:14:15.074 186333 DEBUG nova.compute.manager [req-f54612ad-430e-4537-879c-a01f1df987b5 req-27a2424d-38ea-48a9-8c1e-fbd15d5ae56c fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Received event network-vif-deleted-f2156bd9-041c-4b4a-8a98-32fa8b5b95a0 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:14:15 compute-0 nova_compute[186329]: 2025-12-05 06:14:15.074 186333 INFO nova.compute.manager [req-f54612ad-430e-4537-879c-a01f1df987b5 req-27a2424d-38ea-48a9-8c1e-fbd15d5ae56c fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Neutron deleted interface f2156bd9-041c-4b4a-8a98-32fa8b5b95a0; detaching it from the instance and deleting it from the info cache
Dec 05 06:14:15 compute-0 nova_compute[186329]: 2025-12-05 06:14:15.074 186333 DEBUG nova.network.neutron [req-f54612ad-430e-4537-879c-a01f1df987b5 req-27a2424d-38ea-48a9-8c1e-fbd15d5ae56c fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:14:15 compute-0 nova_compute[186329]: 2025-12-05 06:14:15.149 186333 DEBUG nova.network.neutron [-] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:14:15 compute-0 nova_compute[186329]: 2025-12-05 06:14:15.581 186333 DEBUG nova.compute.manager [req-f54612ad-430e-4537-879c-a01f1df987b5 req-27a2424d-38ea-48a9-8c1e-fbd15d5ae56c fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Detach interface failed, port_id=f2156bd9-041c-4b4a-8a98-32fa8b5b95a0, reason: Instance 2f8d28d2-73a5-43e3-84d2-0e117d02d93a could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Dec 05 06:14:15 compute-0 nova_compute[186329]: 2025-12-05 06:14:15.653 186333 INFO nova.compute.manager [-] [instance: 2f8d28d2-73a5-43e3-84d2-0e117d02d93a] Took 1.58 seconds to deallocate network for instance.
Dec 05 06:14:16 compute-0 nova_compute[186329]: 2025-12-05 06:14:16.165 186333 DEBUG oslo_concurrency.lockutils [None req-319e7dcb-7d8e-4471-abb6-77086d474062 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:14:16 compute-0 nova_compute[186329]: 2025-12-05 06:14:16.165 186333 DEBUG oslo_concurrency.lockutils [None req-319e7dcb-7d8e-4471-abb6-77086d474062 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:14:16 compute-0 nova_compute[186329]: 2025-12-05 06:14:16.171 186333 DEBUG oslo_concurrency.lockutils [None req-319e7dcb-7d8e-4471-abb6-77086d474062 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:14:16 compute-0 nova_compute[186329]: 2025-12-05 06:14:16.206 186333 INFO nova.scheduler.client.report [None req-319e7dcb-7d8e-4471-abb6-77086d474062 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Deleted allocations for instance 2f8d28d2-73a5-43e3-84d2-0e117d02d93a
Dec 05 06:14:17 compute-0 nova_compute[186329]: 2025-12-05 06:14:17.223 186333 DEBUG oslo_concurrency.lockutils [None req-319e7dcb-7d8e-4471-abb6-77086d474062 e5df4001be694d0a80e0436f215d8a10 5e3ec7864ec74a8e9a98ea7d30769fb0 - - default default] Lock "2f8d28d2-73a5-43e3-84d2-0e117d02d93a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.948s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:14:18 compute-0 podman[208426]: 2025-12-05 06:14:18.465133506 +0000 UTC m=+0.050075065 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Dec 05 06:14:18 compute-0 podman[208428]: 2025-12-05 06:14:18.477651398 +0000 UTC m=+0.059238316 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Dec 05 06:14:18 compute-0 podman[208427]: 2025-12-05 06:14:18.50058687 +0000 UTC m=+0.084294527 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vcs-type=git, version=9.6, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7)
Dec 05 06:14:18 compute-0 nova_compute[186329]: 2025-12-05 06:14:18.559 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:18 compute-0 nova_compute[186329]: 2025-12-05 06:14:18.611 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:21 compute-0 nova_compute[186329]: 2025-12-05 06:14:21.678 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:23 compute-0 nova_compute[186329]: 2025-12-05 06:14:23.560 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:23 compute-0 nova_compute[186329]: 2025-12-05 06:14:23.614 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:28 compute-0 nova_compute[186329]: 2025-12-05 06:14:28.562 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:28 compute-0 nova_compute[186329]: 2025-12-05 06:14:28.615 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:29.494 104041 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:14:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:29.494 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:14:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:29.494 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:14:29 compute-0 podman[196599]: time="2025-12-05T06:14:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:14:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:14:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:14:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:14:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2574 "" "Go-http-client/1.1"
Dec 05 06:14:30 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:30.645 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:b5:0e 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-a344c69a-1809-4075-a0ff-98bf8b4cfc94', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a344c69a-1809-4075-a0ff-98bf8b4cfc94', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6630eec87f9847c89bfa1dcb9f3ce845', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2f3c6a57-7d9b-47e3-a5c4-019f1f878562, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=a409630f-3010-48f8-9d63-6150be086210) old=Port_Binding(mac=['fa:16:3e:1e:b5:0e'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-a344c69a-1809-4075-a0ff-98bf8b4cfc94', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a344c69a-1809-4075-a0ff-98bf8b4cfc94', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6630eec87f9847c89bfa1dcb9f3ce845', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:14:30 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:30.646 104041 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port a409630f-3010-48f8-9d63-6150be086210 in datapath a344c69a-1809-4075-a0ff-98bf8b4cfc94 updated
Dec 05 06:14:30 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:30.647 104041 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a344c69a-1809-4075-a0ff-98bf8b4cfc94, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 05 06:14:30 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:30.648 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[da94e3e1-3e34-4126-9397-bc299add8673]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:14:31 compute-0 openstack_network_exporter[198686]: ERROR   06:14:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:14:31 compute-0 openstack_network_exporter[198686]: ERROR   06:14:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:14:31 compute-0 openstack_network_exporter[198686]: ERROR   06:14:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:14:31 compute-0 openstack_network_exporter[198686]: ERROR   06:14:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:14:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:14:31 compute-0 openstack_network_exporter[198686]: ERROR   06:14:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:14:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:14:33 compute-0 nova_compute[186329]: 2025-12-05 06:14:33.563 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:33 compute-0 nova_compute[186329]: 2025-12-05 06:14:33.617 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:35 compute-0 nova_compute[186329]: 2025-12-05 06:14:35.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:14:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:38.513 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6a:c5:af 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-c17a5359-8395-4b1e-b86d-0b749acb4127', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c17a5359-8395-4b1e-b86d-0b749acb4127', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e60b6b7be5bd497c8a89b66b86c083f5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1085ad15-cb8c-4ff3-8c12-9a49572c3ecc, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=6697e5d9-a2a4-4c53-b05b-94afbf365c61) old=Port_Binding(mac=['fa:16:3e:6a:c5:af'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-c17a5359-8395-4b1e-b86d-0b749acb4127', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c17a5359-8395-4b1e-b86d-0b749acb4127', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e60b6b7be5bd497c8a89b66b86c083f5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:14:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:38.513 104041 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 6697e5d9-a2a4-4c53-b05b-94afbf365c61 in datapath c17a5359-8395-4b1e-b86d-0b749acb4127 updated
Dec 05 06:14:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:38.514 104041 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c17a5359-8395-4b1e-b86d-0b749acb4127, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 05 06:14:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:38.514 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[8e7e81c0-3f55-4b30-98a7-fba03c1f413a]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:14:38 compute-0 nova_compute[186329]: 2025-12-05 06:14:38.565 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:38 compute-0 nova_compute[186329]: 2025-12-05 06:14:38.618 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:41 compute-0 podman[208480]: 2025-12-05 06:14:41.458116717 +0000 UTC m=+0.041160177 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 06:14:41 compute-0 podman[208479]: 2025-12-05 06:14:41.475436748 +0000 UTC m=+0.060507933 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec 05 06:14:42 compute-0 nova_compute[186329]: 2025-12-05 06:14:42.209 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:14:42 compute-0 nova_compute[186329]: 2025-12-05 06:14:42.714 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:14:43 compute-0 nova_compute[186329]: 2025-12-05 06:14:43.227 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:14:43 compute-0 nova_compute[186329]: 2025-12-05 06:14:43.227 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:14:43 compute-0 nova_compute[186329]: 2025-12-05 06:14:43.227 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:14:43 compute-0 nova_compute[186329]: 2025-12-05 06:14:43.227 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 05 06:14:43 compute-0 nova_compute[186329]: 2025-12-05 06:14:43.418 186333 WARNING nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:14:43 compute-0 nova_compute[186329]: 2025-12-05 06:14:43.419 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:14:43 compute-0 nova_compute[186329]: 2025-12-05 06:14:43.436 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:14:43 compute-0 nova_compute[186329]: 2025-12-05 06:14:43.437 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5844MB free_disk=73.17048263549805GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": 
"0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 05 06:14:43 compute-0 nova_compute[186329]: 2025-12-05 06:14:43.437 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:14:43 compute-0 nova_compute[186329]: 2025-12-05 06:14:43.437 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:14:43 compute-0 nova_compute[186329]: 2025-12-05 06:14:43.567 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:43 compute-0 nova_compute[186329]: 2025-12-05 06:14:43.621 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:44 compute-0 nova_compute[186329]: 2025-12-05 06:14:44.474 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 05 06:14:44 compute-0 nova_compute[186329]: 2025-12-05 06:14:44.475 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 06:14:43 up 52 min,  0 user,  load average: 0.27, 0.28, 0.36\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 05 06:14:44 compute-0 nova_compute[186329]: 2025-12-05 06:14:44.489 186333 DEBUG nova.compute.provider_tree [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:14:44 compute-0 nova_compute[186329]: 2025-12-05 06:14:44.993 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:14:45 compute-0 nova_compute[186329]: 2025-12-05 06:14:45.500 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 05 06:14:45 compute-0 nova_compute[186329]: 2025-12-05 06:14:45.500 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.063s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:14:45 compute-0 nova_compute[186329]: 2025-12-05 06:14:45.501 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:14:45 compute-0 nova_compute[186329]: 2025-12-05 06:14:45.501 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Dec 05 06:14:46 compute-0 nova_compute[186329]: 2025-12-05 06:14:46.214 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:14:46 compute-0 nova_compute[186329]: 2025-12-05 06:14:46.215 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:14:46 compute-0 nova_compute[186329]: 2025-12-05 06:14:46.215 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:14:46 compute-0 nova_compute[186329]: 2025-12-05 06:14:46.215 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:14:46 compute-0 nova_compute[186329]: 2025-12-05 06:14:46.215 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:14:46 compute-0 nova_compute[186329]: 2025-12-05 06:14:46.215 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:14:46 compute-0 nova_compute[186329]: 2025-12-05 06:14:46.216 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 05 06:14:46 compute-0 nova_compute[186329]: 2025-12-05 06:14:46.216 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:14:46 compute-0 nova_compute[186329]: 2025-12-05 06:14:46.216 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Dec 05 06:14:46 compute-0 nova_compute[186329]: 2025-12-05 06:14:46.719 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Dec 05 06:14:48 compute-0 nova_compute[186329]: 2025-12-05 06:14:48.569 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:48 compute-0 nova_compute[186329]: 2025-12-05 06:14:48.622 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:49 compute-0 nova_compute[186329]: 2025-12-05 06:14:49.213 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:14:49 compute-0 podman[208525]: 2025-12-05 06:14:49.459476647 +0000 UTC m=+0.045097767 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true)
Dec 05 06:14:49 compute-0 podman[208527]: 2025-12-05 06:14:49.463431373 +0000 UTC m=+0.046141380 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.4)
Dec 05 06:14:49 compute-0 podman[208526]: 2025-12-05 06:14:49.489446559 +0000 UTC m=+0.073873374 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., architecture=x86_64, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.expose-services=, version=9.6, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, release=1755695350, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 05 06:14:52 compute-0 nova_compute[186329]: 2025-12-05 06:14:52.119 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:14:52 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:52.692 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '46:84:d1', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'a6:99:88:db:d9:a2'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:14:52 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:52.693 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 05 06:14:52 compute-0 nova_compute[186329]: 2025-12-05 06:14:52.693 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:53 compute-0 nova_compute[186329]: 2025-12-05 06:14:53.571 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:53 compute-0 nova_compute[186329]: 2025-12-05 06:14:53.623 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:54 compute-0 ovn_controller[95223]: 2025-12-05T06:14:54Z|00065|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Dec 05 06:14:57 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:14:57.694 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=89d40815-76f5-4f1d-9077-84d831b7d6c4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:14:58 compute-0 nova_compute[186329]: 2025-12-05 06:14:58.573 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:58 compute-0 nova_compute[186329]: 2025-12-05 06:14:58.624 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:14:59 compute-0 podman[196599]: time="2025-12-05T06:14:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:14:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:14:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:14:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:14:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2578 "" "Go-http-client/1.1"
Dec 05 06:15:01 compute-0 openstack_network_exporter[198686]: ERROR   06:15:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:15:01 compute-0 openstack_network_exporter[198686]: ERROR   06:15:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:15:01 compute-0 openstack_network_exporter[198686]: ERROR   06:15:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:15:01 compute-0 openstack_network_exporter[198686]: ERROR   06:15:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:15:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:15:01 compute-0 openstack_network_exporter[198686]: ERROR   06:15:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:15:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:15:03 compute-0 nova_compute[186329]: 2025-12-05 06:15:03.575 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:03 compute-0 nova_compute[186329]: 2025-12-05 06:15:03.625 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:08 compute-0 nova_compute[186329]: 2025-12-05 06:15:08.269 186333 DEBUG oslo_concurrency.lockutils [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Acquiring lock "b1120a6b-8463-4925-b4b0-3ebf3845041a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:15:08 compute-0 nova_compute[186329]: 2025-12-05 06:15:08.269 186333 DEBUG oslo_concurrency.lockutils [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Lock "b1120a6b-8463-4925-b4b0-3ebf3845041a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:15:08 compute-0 nova_compute[186329]: 2025-12-05 06:15:08.577 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:08 compute-0 nova_compute[186329]: 2025-12-05 06:15:08.626 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:08 compute-0 nova_compute[186329]: 2025-12-05 06:15:08.772 186333 DEBUG nova.compute.manager [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Dec 05 06:15:09 compute-0 nova_compute[186329]: 2025-12-05 06:15:09.304 186333 DEBUG oslo_concurrency.lockutils [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:15:09 compute-0 nova_compute[186329]: 2025-12-05 06:15:09.304 186333 DEBUG oslo_concurrency.lockutils [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:15:09 compute-0 nova_compute[186329]: 2025-12-05 06:15:09.309 186333 DEBUG nova.virt.hardware [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 05 06:15:09 compute-0 nova_compute[186329]: 2025-12-05 06:15:09.309 186333 INFO nova.compute.claims [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Claim successful on node compute-0.ctlplane.example.com
Dec 05 06:15:10 compute-0 nova_compute[186329]: 2025-12-05 06:15:10.345 186333 DEBUG nova.compute.provider_tree [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:15:10 compute-0 nova_compute[186329]: 2025-12-05 06:15:10.849 186333 DEBUG nova.scheduler.client.report [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:15:11 compute-0 nova_compute[186329]: 2025-12-05 06:15:11.356 186333 DEBUG oslo_concurrency.lockutils [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.052s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:15:11 compute-0 nova_compute[186329]: 2025-12-05 06:15:11.358 186333 DEBUG nova.compute.manager [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Dec 05 06:15:11 compute-0 nova_compute[186329]: 2025-12-05 06:15:11.865 186333 DEBUG nova.compute.manager [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Dec 05 06:15:11 compute-0 nova_compute[186329]: 2025-12-05 06:15:11.866 186333 DEBUG nova.network.neutron [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Dec 05 06:15:11 compute-0 nova_compute[186329]: 2025-12-05 06:15:11.866 186333 WARNING neutronclient.v2_0.client [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:15:11 compute-0 nova_compute[186329]: 2025-12-05 06:15:11.866 186333 WARNING neutronclient.v2_0.client [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:15:12 compute-0 nova_compute[186329]: 2025-12-05 06:15:12.371 186333 INFO nova.virt.libvirt.driver [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 06:15:12 compute-0 podman[208579]: 2025-12-05 06:15:12.483496332 +0000 UTC m=+0.063711005 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4)
Dec 05 06:15:12 compute-0 podman[208580]: 2025-12-05 06:15:12.490530548 +0000 UTC m=+0.068401945 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 06:15:12 compute-0 nova_compute[186329]: 2025-12-05 06:15:12.713 186333 DEBUG nova.network.neutron [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Successfully created port: be6221fb-4c19-44a2-8009-3bb6e449bfec _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Dec 05 06:15:12 compute-0 nova_compute[186329]: 2025-12-05 06:15:12.875 186333 DEBUG nova.compute.manager [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Dec 05 06:15:13 compute-0 nova_compute[186329]: 2025-12-05 06:15:13.579 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:13 compute-0 nova_compute[186329]: 2025-12-05 06:15:13.601 186333 DEBUG nova.network.neutron [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Successfully updated port: be6221fb-4c19-44a2-8009-3bb6e449bfec _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Dec 05 06:15:13 compute-0 nova_compute[186329]: 2025-12-05 06:15:13.627 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:13 compute-0 nova_compute[186329]: 2025-12-05 06:15:13.644 186333 DEBUG nova.compute.manager [req-b19b47c0-549b-4449-8b60-73f12b7d610f req-5b23e6bc-597a-4268-8d08-58baf441747d fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Received event network-changed-be6221fb-4c19-44a2-8009-3bb6e449bfec external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:15:13 compute-0 nova_compute[186329]: 2025-12-05 06:15:13.644 186333 DEBUG nova.compute.manager [req-b19b47c0-549b-4449-8b60-73f12b7d610f req-5b23e6bc-597a-4268-8d08-58baf441747d fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Refreshing instance network info cache due to event network-changed-be6221fb-4c19-44a2-8009-3bb6e449bfec. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 05 06:15:13 compute-0 nova_compute[186329]: 2025-12-05 06:15:13.644 186333 DEBUG oslo_concurrency.lockutils [req-b19b47c0-549b-4449-8b60-73f12b7d610f req-5b23e6bc-597a-4268-8d08-58baf441747d fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "refresh_cache-b1120a6b-8463-4925-b4b0-3ebf3845041a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:15:13 compute-0 nova_compute[186329]: 2025-12-05 06:15:13.644 186333 DEBUG oslo_concurrency.lockutils [req-b19b47c0-549b-4449-8b60-73f12b7d610f req-5b23e6bc-597a-4268-8d08-58baf441747d fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquired lock "refresh_cache-b1120a6b-8463-4925-b4b0-3ebf3845041a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:15:13 compute-0 nova_compute[186329]: 2025-12-05 06:15:13.645 186333 DEBUG nova.network.neutron [req-b19b47c0-549b-4449-8b60-73f12b7d610f req-5b23e6bc-597a-4268-8d08-58baf441747d fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Refreshing network info cache for port be6221fb-4c19-44a2-8009-3bb6e449bfec _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 05 06:15:13 compute-0 nova_compute[186329]: 2025-12-05 06:15:13.887 186333 DEBUG nova.compute.manager [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Dec 05 06:15:13 compute-0 nova_compute[186329]: 2025-12-05 06:15:13.888 186333 DEBUG nova.virt.libvirt.driver [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Dec 05 06:15:13 compute-0 nova_compute[186329]: 2025-12-05 06:15:13.888 186333 INFO nova.virt.libvirt.driver [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Creating image(s)
Dec 05 06:15:13 compute-0 nova_compute[186329]: 2025-12-05 06:15:13.889 186333 DEBUG oslo_concurrency.lockutils [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Acquiring lock "/var/lib/nova/instances/b1120a6b-8463-4925-b4b0-3ebf3845041a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:15:13 compute-0 nova_compute[186329]: 2025-12-05 06:15:13.889 186333 DEBUG oslo_concurrency.lockutils [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Lock "/var/lib/nova/instances/b1120a6b-8463-4925-b4b0-3ebf3845041a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:15:13 compute-0 nova_compute[186329]: 2025-12-05 06:15:13.890 186333 DEBUG oslo_concurrency.lockutils [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Lock "/var/lib/nova/instances/b1120a6b-8463-4925-b4b0-3ebf3845041a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:15:13 compute-0 nova_compute[186329]: 2025-12-05 06:15:13.890 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:15:13 compute-0 nova_compute[186329]: 2025-12-05 06:15:13.892 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:15:13 compute-0 nova_compute[186329]: 2025-12-05 06:15:13.894 186333 DEBUG oslo_concurrency.processutils [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:15:13 compute-0 nova_compute[186329]: 2025-12-05 06:15:13.935 186333 DEBUG oslo_concurrency.processutils [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:15:13 compute-0 nova_compute[186329]: 2025-12-05 06:15:13.936 186333 DEBUG oslo_concurrency.lockutils [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Acquiring lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:15:13 compute-0 nova_compute[186329]: 2025-12-05 06:15:13.936 186333 DEBUG oslo_concurrency.lockutils [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:15:13 compute-0 nova_compute[186329]: 2025-12-05 06:15:13.937 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:15:13 compute-0 nova_compute[186329]: 2025-12-05 06:15:13.939 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:15:13 compute-0 nova_compute[186329]: 2025-12-05 06:15:13.940 186333 DEBUG oslo_concurrency.processutils [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:15:13 compute-0 nova_compute[186329]: 2025-12-05 06:15:13.979 186333 DEBUG oslo_concurrency.processutils [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:15:13 compute-0 nova_compute[186329]: 2025-12-05 06:15:13.980 186333 DEBUG oslo_concurrency.processutils [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b,backing_fmt=raw /var/lib/nova/instances/b1120a6b-8463-4925-b4b0-3ebf3845041a/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:15:13 compute-0 nova_compute[186329]: 2025-12-05 06:15:13.998 186333 DEBUG oslo_concurrency.processutils [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b,backing_fmt=raw /var/lib/nova/instances/b1120a6b-8463-4925-b4b0-3ebf3845041a/disk 1073741824" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:15:13 compute-0 nova_compute[186329]: 2025-12-05 06:15:13.999 186333 DEBUG oslo_concurrency.lockutils [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.062s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:15:13 compute-0 nova_compute[186329]: 2025-12-05 06:15:13.999 186333 DEBUG oslo_concurrency.processutils [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:15:14 compute-0 nova_compute[186329]: 2025-12-05 06:15:14.040 186333 DEBUG oslo_concurrency.processutils [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:15:14 compute-0 nova_compute[186329]: 2025-12-05 06:15:14.041 186333 DEBUG nova.virt.disk.api [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Checking if we can resize image /var/lib/nova/instances/b1120a6b-8463-4925-b4b0-3ebf3845041a/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 05 06:15:14 compute-0 nova_compute[186329]: 2025-12-05 06:15:14.041 186333 DEBUG oslo_concurrency.processutils [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1120a6b-8463-4925-b4b0-3ebf3845041a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:15:14 compute-0 nova_compute[186329]: 2025-12-05 06:15:14.083 186333 DEBUG oslo_concurrency.processutils [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1120a6b-8463-4925-b4b0-3ebf3845041a/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:15:14 compute-0 nova_compute[186329]: 2025-12-05 06:15:14.083 186333 DEBUG nova.virt.disk.api [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Cannot resize image /var/lib/nova/instances/b1120a6b-8463-4925-b4b0-3ebf3845041a/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 05 06:15:14 compute-0 nova_compute[186329]: 2025-12-05 06:15:14.084 186333 DEBUG nova.virt.libvirt.driver [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Dec 05 06:15:14 compute-0 nova_compute[186329]: 2025-12-05 06:15:14.084 186333 DEBUG nova.virt.libvirt.driver [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Ensure instance console log exists: /var/lib/nova/instances/b1120a6b-8463-4925-b4b0-3ebf3845041a/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 05 06:15:14 compute-0 nova_compute[186329]: 2025-12-05 06:15:14.084 186333 DEBUG oslo_concurrency.lockutils [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:15:14 compute-0 nova_compute[186329]: 2025-12-05 06:15:14.085 186333 DEBUG oslo_concurrency.lockutils [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:15:14 compute-0 nova_compute[186329]: 2025-12-05 06:15:14.085 186333 DEBUG oslo_concurrency.lockutils [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:15:14 compute-0 nova_compute[186329]: 2025-12-05 06:15:14.105 186333 DEBUG oslo_concurrency.lockutils [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Acquiring lock "refresh_cache-b1120a6b-8463-4925-b4b0-3ebf3845041a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:15:14 compute-0 nova_compute[186329]: 2025-12-05 06:15:14.148 186333 WARNING neutronclient.v2_0.client [req-b19b47c0-549b-4449-8b60-73f12b7d610f req-5b23e6bc-597a-4268-8d08-58baf441747d fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:15:14 compute-0 nova_compute[186329]: 2025-12-05 06:15:14.454 186333 DEBUG nova.network.neutron [req-b19b47c0-549b-4449-8b60-73f12b7d610f req-5b23e6bc-597a-4268-8d08-58baf441747d fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 05 06:15:14 compute-0 nova_compute[186329]: 2025-12-05 06:15:14.555 186333 DEBUG nova.network.neutron [req-b19b47c0-549b-4449-8b60-73f12b7d610f req-5b23e6bc-597a-4268-8d08-58baf441747d fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:15:15 compute-0 nova_compute[186329]: 2025-12-05 06:15:15.059 186333 DEBUG oslo_concurrency.lockutils [req-b19b47c0-549b-4449-8b60-73f12b7d610f req-5b23e6bc-597a-4268-8d08-58baf441747d fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Releasing lock "refresh_cache-b1120a6b-8463-4925-b4b0-3ebf3845041a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:15:15 compute-0 nova_compute[186329]: 2025-12-05 06:15:15.060 186333 DEBUG oslo_concurrency.lockutils [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Acquired lock "refresh_cache-b1120a6b-8463-4925-b4b0-3ebf3845041a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:15:15 compute-0 nova_compute[186329]: 2025-12-05 06:15:15.060 186333 DEBUG nova.network.neutron [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 05 06:15:16 compute-0 nova_compute[186329]: 2025-12-05 06:15:16.447 186333 DEBUG nova.network.neutron [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 05 06:15:16 compute-0 nova_compute[186329]: 2025-12-05 06:15:16.631 186333 WARNING neutronclient.v2_0.client [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:15:16 compute-0 nova_compute[186329]: 2025-12-05 06:15:16.735 186333 DEBUG nova.network.neutron [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Updating instance_info_cache with network_info: [{"id": "be6221fb-4c19-44a2-8009-3bb6e449bfec", "address": "fa:16:3e:3d:88:99", "network": {"id": "a344c69a-1809-4075-a0ff-98bf8b4cfc94", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1863291907-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6630eec87f9847c89bfa1dcb9f3ce845", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe6221fb-4c", "ovs_interfaceid": "be6221fb-4c19-44a2-8009-3bb6e449bfec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.238 186333 DEBUG oslo_concurrency.lockutils [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Releasing lock "refresh_cache-b1120a6b-8463-4925-b4b0-3ebf3845041a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.239 186333 DEBUG nova.compute.manager [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Instance network_info: |[{"id": "be6221fb-4c19-44a2-8009-3bb6e449bfec", "address": "fa:16:3e:3d:88:99", "network": {"id": "a344c69a-1809-4075-a0ff-98bf8b4cfc94", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1863291907-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6630eec87f9847c89bfa1dcb9f3ce845", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe6221fb-4c", "ovs_interfaceid": "be6221fb-4c19-44a2-8009-3bb6e449bfec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.240 186333 DEBUG nova.virt.libvirt.driver [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Start _get_guest_xml network_info=[{"id": "be6221fb-4c19-44a2-8009-3bb6e449bfec", "address": "fa:16:3e:3d:88:99", "network": {"id": "a344c69a-1809-4075-a0ff-98bf8b4cfc94", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1863291907-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6630eec87f9847c89bfa1dcb9f3ce845", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe6221fb-4c", "ovs_interfaceid": "be6221fb-4c19-44a2-8009-3bb6e449bfec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T06:07:47Z,direct_url=<?>,disk_format='qcow2',id=6903ca06-7f44-4ad2-ab8b-0d16feef7d51,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fcef582be2274b9ba43451b49b4066ec',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T06:07:49Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'device_name': '/dev/vda', 'guest_format': None, 'image_id': '6903ca06-7f44-4ad2-ab8b-0d16feef7d51'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.244 186333 WARNING nova.virt.libvirt.driver [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.245 186333 DEBUG nova.virt.driver [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='6903ca06-7f44-4ad2-ab8b-0d16feef7d51', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteBasicStrategy-server-1599472882', uuid='b1120a6b-8463-4925-b4b0-3ebf3845041a'), owner=OwnerMeta(userid='bcefd7e9a4ec4993ad72b4790a5f4624', username='tempest-TestExecuteBasicStrategy-429784391-project-admin', projectid='e60b6b7be5bd497c8a89b66b86c083f5', projectname='tempest-TestExecuteBasicStrategy-429784391'), image=ImageMeta(id='6903ca06-7f44-4ad2-ab8b-0d16feef7d51', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='cb13e320-971c-46c2-a935-d695f3631bf8', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "be6221fb-4c19-44a2-8009-3bb6e449bfec", "address": "fa:16:3e:3d:88:99", "network": {"id": "a344c69a-1809-4075-a0ff-98bf8b4cfc94", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1863291907-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6630eec87f9847c89bfa1dcb9f3ce845", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe6221fb-4c", "ovs_interfaceid": "be6221fb-4c19-44a2-8009-3bb6e449bfec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764915317.2452903) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.249 186333 DEBUG nova.virt.libvirt.host [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.250 186333 DEBUG nova.virt.libvirt.host [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.252 186333 DEBUG nova.virt.libvirt.host [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.253 186333 DEBUG nova.virt.libvirt.host [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.254 186333 DEBUG nova.virt.libvirt.driver [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.254 186333 DEBUG nova.virt.hardware [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T06:07:46Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cb13e320-971c-46c2-a935-d695f3631bf8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T06:07:47Z,direct_url=<?>,disk_format='qcow2',id=6903ca06-7f44-4ad2-ab8b-0d16feef7d51,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fcef582be2274b9ba43451b49b4066ec',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T06:07:49Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.254 186333 DEBUG nova.virt.hardware [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.254 186333 DEBUG nova.virt.hardware [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.255 186333 DEBUG nova.virt.hardware [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.255 186333 DEBUG nova.virt.hardware [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.255 186333 DEBUG nova.virt.hardware [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.255 186333 DEBUG nova.virt.hardware [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.255 186333 DEBUG nova.virt.hardware [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.255 186333 DEBUG nova.virt.hardware [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.256 186333 DEBUG nova.virt.hardware [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.256 186333 DEBUG nova.virt.hardware [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.258 186333 DEBUG nova.virt.libvirt.vif [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-05T06:15:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-1599472882',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-1599472882',id=7,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e60b6b7be5bd497c8a89b66b86c083f5',ramdisk_id='',reservation_id='r-z288xhi6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,member,admin,reader',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-429784391',owner_user_name='tempest-TestExecuteBasicStrategy-429784391-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T06:15:12Z,user_data=None,user_id='bcefd7e9a4ec4993ad72b4790a5f4624',uuid=b1120a6b-8463-4925-b4b0-3ebf3845041a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "be6221fb-4c19-44a2-8009-3bb6e449bfec", "address": "fa:16:3e:3d:88:99", "network": {"id": "a344c69a-1809-4075-a0ff-98bf8b4cfc94", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1863291907-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6630eec87f9847c89bfa1dcb9f3ce845", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe6221fb-4c", "ovs_interfaceid": "be6221fb-4c19-44a2-8009-3bb6e449bfec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.259 186333 DEBUG nova.network.os_vif_util [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Converting VIF {"id": "be6221fb-4c19-44a2-8009-3bb6e449bfec", "address": "fa:16:3e:3d:88:99", "network": {"id": "a344c69a-1809-4075-a0ff-98bf8b4cfc94", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1863291907-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6630eec87f9847c89bfa1dcb9f3ce845", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe6221fb-4c", "ovs_interfaceid": "be6221fb-4c19-44a2-8009-3bb6e449bfec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.259 186333 DEBUG nova.network.os_vif_util [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3d:88:99,bridge_name='br-int',has_traffic_filtering=True,id=be6221fb-4c19-44a2-8009-3bb6e449bfec,network=Network(a344c69a-1809-4075-a0ff-98bf8b4cfc94),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe6221fb-4c') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.260 186333 DEBUG nova.objects.instance [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Lazy-loading 'pci_devices' on Instance uuid b1120a6b-8463-4925-b4b0-3ebf3845041a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.764 186333 DEBUG nova.virt.libvirt.driver [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] End _get_guest_xml xml=<domain type="kvm">
Dec 05 06:15:17 compute-0 nova_compute[186329]:   <uuid>b1120a6b-8463-4925-b4b0-3ebf3845041a</uuid>
Dec 05 06:15:17 compute-0 nova_compute[186329]:   <name>instance-00000007</name>
Dec 05 06:15:17 compute-0 nova_compute[186329]:   <memory>131072</memory>
Dec 05 06:15:17 compute-0 nova_compute[186329]:   <vcpu>1</vcpu>
Dec 05 06:15:17 compute-0 nova_compute[186329]:   <metadata>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 06:15:17 compute-0 nova_compute[186329]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:       <nova:name>tempest-TestExecuteBasicStrategy-server-1599472882</nova:name>
Dec 05 06:15:17 compute-0 nova_compute[186329]:       <nova:creationTime>2025-12-05 06:15:17</nova:creationTime>
Dec 05 06:15:17 compute-0 nova_compute[186329]:       <nova:flavor name="m1.nano" id="cb13e320-971c-46c2-a935-d695f3631bf8">
Dec 05 06:15:17 compute-0 nova_compute[186329]:         <nova:memory>128</nova:memory>
Dec 05 06:15:17 compute-0 nova_compute[186329]:         <nova:disk>1</nova:disk>
Dec 05 06:15:17 compute-0 nova_compute[186329]:         <nova:swap>0</nova:swap>
Dec 05 06:15:17 compute-0 nova_compute[186329]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 06:15:17 compute-0 nova_compute[186329]:         <nova:vcpus>1</nova:vcpus>
Dec 05 06:15:17 compute-0 nova_compute[186329]:         <nova:extraSpecs>
Dec 05 06:15:17 compute-0 nova_compute[186329]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 05 06:15:17 compute-0 nova_compute[186329]:         </nova:extraSpecs>
Dec 05 06:15:17 compute-0 nova_compute[186329]:       </nova:flavor>
Dec 05 06:15:17 compute-0 nova_compute[186329]:       <nova:image uuid="6903ca06-7f44-4ad2-ab8b-0d16feef7d51">
Dec 05 06:15:17 compute-0 nova_compute[186329]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 05 06:15:17 compute-0 nova_compute[186329]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 05 06:15:17 compute-0 nova_compute[186329]:         <nova:minDisk>1</nova:minDisk>
Dec 05 06:15:17 compute-0 nova_compute[186329]:         <nova:minRam>0</nova:minRam>
Dec 05 06:15:17 compute-0 nova_compute[186329]:         <nova:properties>
Dec 05 06:15:17 compute-0 nova_compute[186329]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 05 06:15:17 compute-0 nova_compute[186329]:         </nova:properties>
Dec 05 06:15:17 compute-0 nova_compute[186329]:       </nova:image>
Dec 05 06:15:17 compute-0 nova_compute[186329]:       <nova:owner>
Dec 05 06:15:17 compute-0 nova_compute[186329]:         <nova:user uuid="bcefd7e9a4ec4993ad72b4790a5f4624">tempest-TestExecuteBasicStrategy-429784391-project-admin</nova:user>
Dec 05 06:15:17 compute-0 nova_compute[186329]:         <nova:project uuid="e60b6b7be5bd497c8a89b66b86c083f5">tempest-TestExecuteBasicStrategy-429784391</nova:project>
Dec 05 06:15:17 compute-0 nova_compute[186329]:       </nova:owner>
Dec 05 06:15:17 compute-0 nova_compute[186329]:       <nova:root type="image" uuid="6903ca06-7f44-4ad2-ab8b-0d16feef7d51"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:       <nova:ports>
Dec 05 06:15:17 compute-0 nova_compute[186329]:         <nova:port uuid="be6221fb-4c19-44a2-8009-3bb6e449bfec">
Dec 05 06:15:17 compute-0 nova_compute[186329]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:         </nova:port>
Dec 05 06:15:17 compute-0 nova_compute[186329]:       </nova:ports>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     </nova:instance>
Dec 05 06:15:17 compute-0 nova_compute[186329]:   </metadata>
Dec 05 06:15:17 compute-0 nova_compute[186329]:   <sysinfo type="smbios">
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <system>
Dec 05 06:15:17 compute-0 nova_compute[186329]:       <entry name="manufacturer">RDO</entry>
Dec 05 06:15:17 compute-0 nova_compute[186329]:       <entry name="product">OpenStack Compute</entry>
Dec 05 06:15:17 compute-0 nova_compute[186329]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 05 06:15:17 compute-0 nova_compute[186329]:       <entry name="serial">b1120a6b-8463-4925-b4b0-3ebf3845041a</entry>
Dec 05 06:15:17 compute-0 nova_compute[186329]:       <entry name="uuid">b1120a6b-8463-4925-b4b0-3ebf3845041a</entry>
Dec 05 06:15:17 compute-0 nova_compute[186329]:       <entry name="family">Virtual Machine</entry>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     </system>
Dec 05 06:15:17 compute-0 nova_compute[186329]:   </sysinfo>
Dec 05 06:15:17 compute-0 nova_compute[186329]:   <os>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <boot dev="hd"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <smbios mode="sysinfo"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:   </os>
Dec 05 06:15:17 compute-0 nova_compute[186329]:   <features>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <acpi/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <apic/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <vmcoreinfo/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:   </features>
Dec 05 06:15:17 compute-0 nova_compute[186329]:   <clock offset="utc">
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <timer name="hpet" present="no"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:   </clock>
Dec 05 06:15:17 compute-0 nova_compute[186329]:   <cpu mode="custom" match="exact">
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <model>Nehalem</model>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:   </cpu>
Dec 05 06:15:17 compute-0 nova_compute[186329]:   <devices>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <disk type="file" device="disk">
Dec 05 06:15:17 compute-0 nova_compute[186329]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:       <source file="/var/lib/nova/instances/b1120a6b-8463-4925-b4b0-3ebf3845041a/disk"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:       <target dev="vda" bus="virtio"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     </disk>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <disk type="file" device="cdrom">
Dec 05 06:15:17 compute-0 nova_compute[186329]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:       <source file="/var/lib/nova/instances/b1120a6b-8463-4925-b4b0-3ebf3845041a/disk.config"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:       <target dev="sda" bus="sata"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     </disk>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <interface type="ethernet">
Dec 05 06:15:17 compute-0 nova_compute[186329]:       <mac address="fa:16:3e:3d:88:99"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:       <model type="virtio"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:       <mtu size="1442"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:       <target dev="tapbe6221fb-4c"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     </interface>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <serial type="pty">
Dec 05 06:15:17 compute-0 nova_compute[186329]:       <log file="/var/lib/nova/instances/b1120a6b-8463-4925-b4b0-3ebf3845041a/console.log" append="off"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     </serial>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <video>
Dec 05 06:15:17 compute-0 nova_compute[186329]:       <model type="virtio"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     </video>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <input type="tablet" bus="usb"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <rng model="virtio">
Dec 05 06:15:17 compute-0 nova_compute[186329]:       <backend model="random">/dev/urandom</backend>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     </rng>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <controller type="usb" index="0"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 05 06:15:17 compute-0 nova_compute[186329]:       <stats period="10"/>
Dec 05 06:15:17 compute-0 nova_compute[186329]:     </memballoon>
Dec 05 06:15:17 compute-0 nova_compute[186329]:   </devices>
Dec 05 06:15:17 compute-0 nova_compute[186329]: </domain>
Dec 05 06:15:17 compute-0 nova_compute[186329]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.765 186333 DEBUG nova.compute.manager [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Preparing to wait for external event network-vif-plugged-be6221fb-4c19-44a2-8009-3bb6e449bfec prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.765 186333 DEBUG oslo_concurrency.lockutils [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Acquiring lock "b1120a6b-8463-4925-b4b0-3ebf3845041a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.766 186333 DEBUG oslo_concurrency.lockutils [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Lock "b1120a6b-8463-4925-b4b0-3ebf3845041a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.766 186333 DEBUG oslo_concurrency.lockutils [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Lock "b1120a6b-8463-4925-b4b0-3ebf3845041a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.766 186333 DEBUG nova.virt.libvirt.vif [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-05T06:15:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-1599472882',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-1599472882',id=7,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e60b6b7be5bd497c8a89b66b86c083f5',ramdisk_id='',reservation_id='r-z288xhi6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,member,admin,reader',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-429784391',owner_user_name='tempest-TestExecuteBasicStrategy-429784391-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T06:15:12Z,user_data=None,user_id='bcefd7e9a4ec4993ad72b4790a5f4624',uuid=b1120a6b-8463-4925-b4b0-3ebf3845041a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "be6221fb-4c19-44a2-8009-3bb6e449bfec", "address": "fa:16:3e:3d:88:99", "network": {"id": "a344c69a-1809-4075-a0ff-98bf8b4cfc94", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1863291907-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6630eec87f9847c89bfa1dcb9f3ce845", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe6221fb-4c", "ovs_interfaceid": "be6221fb-4c19-44a2-8009-3bb6e449bfec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.767 186333 DEBUG nova.network.os_vif_util [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Converting VIF {"id": "be6221fb-4c19-44a2-8009-3bb6e449bfec", "address": "fa:16:3e:3d:88:99", "network": {"id": "a344c69a-1809-4075-a0ff-98bf8b4cfc94", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1863291907-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6630eec87f9847c89bfa1dcb9f3ce845", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe6221fb-4c", "ovs_interfaceid": "be6221fb-4c19-44a2-8009-3bb6e449bfec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.767 186333 DEBUG nova.network.os_vif_util [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3d:88:99,bridge_name='br-int',has_traffic_filtering=True,id=be6221fb-4c19-44a2-8009-3bb6e449bfec,network=Network(a344c69a-1809-4075-a0ff-98bf8b4cfc94),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe6221fb-4c') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.767 186333 DEBUG os_vif [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3d:88:99,bridge_name='br-int',has_traffic_filtering=True,id=be6221fb-4c19-44a2-8009-3bb6e449bfec,network=Network(a344c69a-1809-4075-a0ff-98bf8b4cfc94),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe6221fb-4c') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.768 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.768 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.768 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.769 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.769 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '7a328d55-37c1-5f2d-9060-36de134054d9', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.770 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.773 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.775 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.775 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbe6221fb-4c, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.775 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapbe6221fb-4c, col_values=(('qos', UUID('9686ab1c-a20f-4385-a6cd-84c37161b0a0')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.776 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapbe6221fb-4c, col_values=(('external_ids', {'iface-id': 'be6221fb-4c19-44a2-8009-3bb6e449bfec', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3d:88:99', 'vm-uuid': 'b1120a6b-8463-4925-b4b0-3ebf3845041a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.776 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.778 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:15:17 compute-0 NetworkManager[55434]: <info>  [1764915317.7787] manager: (tapbe6221fb-4c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/33)
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.781 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:17 compute-0 nova_compute[186329]: 2025-12-05 06:15:17.781 186333 INFO os_vif [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3d:88:99,bridge_name='br-int',has_traffic_filtering=True,id=be6221fb-4c19-44a2-8009-3bb6e449bfec,network=Network(a344c69a-1809-4075-a0ff-98bf8b4cfc94),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe6221fb-4c')
Dec 05 06:15:18 compute-0 nova_compute[186329]: 2025-12-05 06:15:18.629 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:19 compute-0 nova_compute[186329]: 2025-12-05 06:15:19.305 186333 DEBUG nova.virt.libvirt.driver [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 05 06:15:19 compute-0 nova_compute[186329]: 2025-12-05 06:15:19.305 186333 DEBUG nova.virt.libvirt.driver [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 05 06:15:19 compute-0 nova_compute[186329]: 2025-12-05 06:15:19.306 186333 DEBUG nova.virt.libvirt.driver [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] No VIF found with MAC fa:16:3e:3d:88:99, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 05 06:15:19 compute-0 nova_compute[186329]: 2025-12-05 06:15:19.306 186333 INFO nova.virt.libvirt.driver [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Using config drive
Dec 05 06:15:19 compute-0 nova_compute[186329]: 2025-12-05 06:15:19.814 186333 WARNING neutronclient.v2_0.client [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:15:19 compute-0 nova_compute[186329]: 2025-12-05 06:15:19.914 186333 INFO nova.virt.libvirt.driver [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Creating config drive at /var/lib/nova/instances/b1120a6b-8463-4925-b4b0-3ebf3845041a/disk.config
Dec 05 06:15:19 compute-0 nova_compute[186329]: 2025-12-05 06:15:19.919 186333 DEBUG oslo_concurrency.processutils [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b1120a6b-8463-4925-b4b0-3ebf3845041a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmprf940fkf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:15:20 compute-0 nova_compute[186329]: 2025-12-05 06:15:20.035 186333 DEBUG oslo_concurrency.processutils [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b1120a6b-8463-4925-b4b0-3ebf3845041a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmprf940fkf" returned: 0 in 0.116s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:15:20 compute-0 kernel: tapbe6221fb-4c: entered promiscuous mode
Dec 05 06:15:20 compute-0 NetworkManager[55434]: <info>  [1764915320.0953] manager: (tapbe6221fb-4c): new Tun device (/org/freedesktop/NetworkManager/Devices/34)
Dec 05 06:15:20 compute-0 ovn_controller[95223]: 2025-12-05T06:15:20Z|00066|binding|INFO|Claiming lport be6221fb-4c19-44a2-8009-3bb6e449bfec for this chassis.
Dec 05 06:15:20 compute-0 ovn_controller[95223]: 2025-12-05T06:15:20Z|00067|binding|INFO|be6221fb-4c19-44a2-8009-3bb6e449bfec: Claiming fa:16:3e:3d:88:99 10.100.0.11
Dec 05 06:15:20 compute-0 nova_compute[186329]: 2025-12-05 06:15:20.100 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:20.102 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3d:88:99 10.100.0.11'], port_security=['fa:16:3e:3d:88:99 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'b1120a6b-8463-4925-b4b0-3ebf3845041a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a344c69a-1809-4075-a0ff-98bf8b4cfc94', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e60b6b7be5bd497c8a89b66b86c083f5', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7386e586-83f2-448e-bac2-9982d533a1d5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2f3c6a57-7d9b-47e3-a5c4-019f1f878562, chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], logical_port=be6221fb-4c19-44a2-8009-3bb6e449bfec) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:20.103 104041 INFO neutron.agent.ovn.metadata.agent [-] Port be6221fb-4c19-44a2-8009-3bb6e449bfec in datapath a344c69a-1809-4075-a0ff-98bf8b4cfc94 bound to our chassis
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:20.103 104041 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a344c69a-1809-4075-a0ff-98bf8b4cfc94
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:20.119 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[93513ac6-4888-41a8-963d-e0c972323177]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:20.119 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa344c69a-11 in ovnmeta-a344c69a-1809-4075-a0ff-98bf8b4cfc94 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:20.121 206383 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa344c69a-10 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:20.121 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[b8d4973c-bd4f-4949-82c0-da7ecdcc48fe]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:20.121 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[0d8fb5c7-5513-463f-896c-198d10713bda]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:15:20 compute-0 systemd-udevd[208680]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:20.141 104153 DEBUG oslo.privsep.daemon [-] privsep: reply[f7cb0da2-1bf1-4574-af29-b4097f6f1c0a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:15:20 compute-0 systemd-machined[152967]: New machine qemu-4-instance-00000007.
Dec 05 06:15:20 compute-0 NetworkManager[55434]: <info>  [1764915320.1616] device (tapbe6221fb-4c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 06:15:20 compute-0 NetworkManager[55434]: <info>  [1764915320.1625] device (tapbe6221fb-4c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:20.164 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[2fb1be59-13f0-4cc3-a38c-a3bcd7d4aedc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:15:20 compute-0 systemd[1]: Started Virtual Machine qemu-4-instance-00000007.
Dec 05 06:15:20 compute-0 nova_compute[186329]: 2025-12-05 06:15:20.176 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:20 compute-0 ovn_controller[95223]: 2025-12-05T06:15:20Z|00068|binding|INFO|Setting lport be6221fb-4c19-44a2-8009-3bb6e449bfec ovn-installed in OVS
Dec 05 06:15:20 compute-0 ovn_controller[95223]: 2025-12-05T06:15:20Z|00069|binding|INFO|Setting lport be6221fb-4c19-44a2-8009-3bb6e449bfec up in Southbound
Dec 05 06:15:20 compute-0 nova_compute[186329]: 2025-12-05 06:15:20.182 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:20.187 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[c75bdc8e-75cb-448a-8e39-b80c84d497f4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:20.193 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[6867cd4c-4cd3-46e3-aa81-2d7e5d82a77d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:15:20 compute-0 NetworkManager[55434]: <info>  [1764915320.1945] manager: (tapa344c69a-10): new Veth device (/org/freedesktop/NetworkManager/Devices/35)
Dec 05 06:15:20 compute-0 systemd-udevd[208695]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 06:15:20 compute-0 podman[208653]: 2025-12-05 06:15:20.200006936 +0000 UTC m=+0.111383072 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:20.223 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[9b1b987c-2220-4e72-8c8b-f9b51aa39aaf]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:20.225 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[d9ae9392-3e0a-4777-bbb1-9d8e7b377a9a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:15:20 compute-0 podman[208651]: 2025-12-05 06:15:20.231769509 +0000 UTC m=+0.146482990 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, container_name=ovn_metadata_agent)
Dec 05 06:15:20 compute-0 NetworkManager[55434]: <info>  [1764915320.2421] device (tapa344c69a-10): carrier: link connected
Dec 05 06:15:20 compute-0 podman[208652]: 2025-12-05 06:15:20.244488144 +0000 UTC m=+0.159106035 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, managed_by=edpm_ansible, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:20.246 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[cb0643d9-5912-4aab-91e2-5cd3d66e40c2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:20.259 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[70ac1fca-f889-43f5-8132-79a17e5d20c9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa344c69a-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1e:b5:0e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 319877, 'reachable_time': 32041, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 208736, 'error': None, 'target': 'ovnmeta-a344c69a-1809-4075-a0ff-98bf8b4cfc94', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:20.272 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[c8f61bca-3c03-43d5-b44e-e6e26bb076f2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1e:b50e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 319877, 'tstamp': 319877}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 208737, 'error': None, 'target': 'ovnmeta-a344c69a-1809-4075-a0ff-98bf8b4cfc94', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:20.286 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[7f19fae1-066f-4a64-a4c6-80e616bc619b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa344c69a-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1e:b5:0e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 319877, 'reachable_time': 32041, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 208738, 'error': None, 'target': 'ovnmeta-a344c69a-1809-4075-a0ff-98bf8b4cfc94', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:15:20 compute-0 nova_compute[186329]: 2025-12-05 06:15:20.304 186333 DEBUG nova.compute.manager [req-4ad266b5-1229-4e02-a2f1-6463393c50ab req-7cc160d1-4ccd-4d22-a755-8bd92d4a971b fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Received event network-vif-plugged-be6221fb-4c19-44a2-8009-3bb6e449bfec external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:15:20 compute-0 nova_compute[186329]: 2025-12-05 06:15:20.305 186333 DEBUG oslo_concurrency.lockutils [req-4ad266b5-1229-4e02-a2f1-6463393c50ab req-7cc160d1-4ccd-4d22-a755-8bd92d4a971b fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "b1120a6b-8463-4925-b4b0-3ebf3845041a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:15:20 compute-0 nova_compute[186329]: 2025-12-05 06:15:20.305 186333 DEBUG oslo_concurrency.lockutils [req-4ad266b5-1229-4e02-a2f1-6463393c50ab req-7cc160d1-4ccd-4d22-a755-8bd92d4a971b fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "b1120a6b-8463-4925-b4b0-3ebf3845041a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:15:20 compute-0 nova_compute[186329]: 2025-12-05 06:15:20.305 186333 DEBUG oslo_concurrency.lockutils [req-4ad266b5-1229-4e02-a2f1-6463393c50ab req-7cc160d1-4ccd-4d22-a755-8bd92d4a971b fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "b1120a6b-8463-4925-b4b0-3ebf3845041a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:15:20 compute-0 nova_compute[186329]: 2025-12-05 06:15:20.306 186333 DEBUG nova.compute.manager [req-4ad266b5-1229-4e02-a2f1-6463393c50ab req-7cc160d1-4ccd-4d22-a755-8bd92d4a971b fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Processing event network-vif-plugged-be6221fb-4c19-44a2-8009-3bb6e449bfec _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:20.313 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[4db93ddf-c03e-4691-82e9-e907c8ffbb20]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:20.353 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[e731453b-e620-4b9d-a727-c512ea97be96]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:20.354 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa344c69a-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:20.355 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:20.355 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa344c69a-10, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:15:20 compute-0 kernel: tapa344c69a-10: entered promiscuous mode
Dec 05 06:15:20 compute-0 NetworkManager[55434]: <info>  [1764915320.3592] manager: (tapa344c69a-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Dec 05 06:15:20 compute-0 nova_compute[186329]: 2025-12-05 06:15:20.358 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:20.362 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa344c69a-10, col_values=(('external_ids', {'iface-id': 'a409630f-3010-48f8-9d63-6150be086210'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:15:20 compute-0 ovn_controller[95223]: 2025-12-05T06:15:20Z|00070|binding|INFO|Releasing lport a409630f-3010-48f8-9d63-6150be086210 from this chassis (sb_readonly=0)
Dec 05 06:15:20 compute-0 nova_compute[186329]: 2025-12-05 06:15:20.365 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:20.366 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[530f1a0d-f72c-47f6-b3e7-343006213fa8]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:20.366 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a344c69a-1809-4075-a0ff-98bf8b4cfc94.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a344c69a-1809-4075-a0ff-98bf8b4cfc94.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:20.367 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a344c69a-1809-4075-a0ff-98bf8b4cfc94.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a344c69a-1809-4075-a0ff-98bf8b4cfc94.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:20.367 104041 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for a344c69a-1809-4075-a0ff-98bf8b4cfc94 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:20.367 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a344c69a-1809-4075-a0ff-98bf8b4cfc94.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a344c69a-1809-4075-a0ff-98bf8b4cfc94.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:20.367 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[53bc41c6-922d-4e2a-8fb2-90e85d687f76]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:20.368 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a344c69a-1809-4075-a0ff-98bf8b4cfc94.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a344c69a-1809-4075-a0ff-98bf8b4cfc94.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:20.368 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[72005be9-4b30-436d-95fb-173fcb60d0d4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:20.369 104041 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]: global
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]:     log         /dev/log local0 debug
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]:     log-tag     haproxy-metadata-proxy-a344c69a-1809-4075-a0ff-98bf8b4cfc94
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]:     user        root
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]:     group       root
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]:     maxconn     1024
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]:     pidfile     /var/lib/neutron/external/pids/a344c69a-1809-4075-a0ff-98bf8b4cfc94.pid.haproxy
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]:     daemon
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]: defaults
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]:     log global
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]:     mode http
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]:     option httplog
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]:     option dontlognull
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]:     option http-server-close
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]:     option forwardfor
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]:     retries                 3
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]:     timeout http-request    30s
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]:     timeout connect         30s
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]:     timeout client          32s
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]:     timeout server          32s
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]:     timeout http-keep-alive 30s
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]: listen listener
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]:     bind 169.254.169.254:80
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]:     
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]:     http-request add-header X-OVN-Network-ID a344c69a-1809-4075-a0ff-98bf8b4cfc94
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 05 06:15:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:20.369 104041 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a344c69a-1809-4075-a0ff-98bf8b4cfc94', 'env', 'PROCESS_TAG=haproxy-a344c69a-1809-4075-a0ff-98bf8b4cfc94', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a344c69a-1809-4075-a0ff-98bf8b4cfc94.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 05 06:15:20 compute-0 nova_compute[186329]: 2025-12-05 06:15:20.374 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:20 compute-0 podman[208766]: 2025-12-05 06:15:20.672530122 +0000 UTC m=+0.031122269 container create 9cdaa4a5c49eee793d291154a78fdf848b08c7a544c1d59321fd953a7b1cb1e8 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-a344c69a-1809-4075-a0ff-98bf8b4cfc94, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, tcib_managed=true)
Dec 05 06:15:20 compute-0 systemd[1]: Started libpod-conmon-9cdaa4a5c49eee793d291154a78fdf848b08c7a544c1d59321fd953a7b1cb1e8.scope.
Dec 05 06:15:20 compute-0 systemd[1]: Started libcrun container.
Dec 05 06:15:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/05267951eb060f87af14245fe8864155cdd66618e1e4b20a620dbc226437ac7e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 06:15:20 compute-0 podman[208766]: 2025-12-05 06:15:20.722974902 +0000 UTC m=+0.081567049 container init 9cdaa4a5c49eee793d291154a78fdf848b08c7a544c1d59321fd953a7b1cb1e8 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-a344c69a-1809-4075-a0ff-98bf8b4cfc94, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Dec 05 06:15:20 compute-0 podman[208766]: 2025-12-05 06:15:20.735394265 +0000 UTC m=+0.093986412 container start 9cdaa4a5c49eee793d291154a78fdf848b08c7a544c1d59321fd953a7b1cb1e8 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-a344c69a-1809-4075-a0ff-98bf8b4cfc94, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 05 06:15:20 compute-0 podman[208766]: 2025-12-05 06:15:20.658678527 +0000 UTC m=+0.017270694 image pull 773fbabd60bc1c8e2ad203301a187a593937fa42bcc751aba0971bd96baac0cb quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current
Dec 05 06:15:20 compute-0 neutron-haproxy-ovnmeta-a344c69a-1809-4075-a0ff-98bf8b4cfc94[208778]: [NOTICE]   (208788) : New worker (208790) forked
Dec 05 06:15:20 compute-0 neutron-haproxy-ovnmeta-a344c69a-1809-4075-a0ff-98bf8b4cfc94[208778]: [NOTICE]   (208788) : Loading success.
Dec 05 06:15:20 compute-0 nova_compute[186329]: 2025-12-05 06:15:20.790 186333 DEBUG nova.compute.manager [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 05 06:15:20 compute-0 nova_compute[186329]: 2025-12-05 06:15:20.792 186333 DEBUG nova.virt.libvirt.driver [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Dec 05 06:15:20 compute-0 nova_compute[186329]: 2025-12-05 06:15:20.794 186333 INFO nova.virt.libvirt.driver [-] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Instance spawned successfully.
Dec 05 06:15:20 compute-0 nova_compute[186329]: 2025-12-05 06:15:20.795 186333 DEBUG nova.virt.libvirt.driver [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Dec 05 06:15:21 compute-0 nova_compute[186329]: 2025-12-05 06:15:21.302 186333 DEBUG nova.virt.libvirt.driver [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:15:21 compute-0 nova_compute[186329]: 2025-12-05 06:15:21.303 186333 DEBUG nova.virt.libvirt.driver [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:15:21 compute-0 nova_compute[186329]: 2025-12-05 06:15:21.303 186333 DEBUG nova.virt.libvirt.driver [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:15:21 compute-0 nova_compute[186329]: 2025-12-05 06:15:21.303 186333 DEBUG nova.virt.libvirt.driver [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:15:21 compute-0 nova_compute[186329]: 2025-12-05 06:15:21.304 186333 DEBUG nova.virt.libvirt.driver [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:15:21 compute-0 nova_compute[186329]: 2025-12-05 06:15:21.304 186333 DEBUG nova.virt.libvirt.driver [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:15:21 compute-0 nova_compute[186329]: 2025-12-05 06:15:21.810 186333 INFO nova.compute.manager [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Took 7.92 seconds to spawn the instance on the hypervisor.
Dec 05 06:15:21 compute-0 nova_compute[186329]: 2025-12-05 06:15:21.810 186333 DEBUG nova.compute.manager [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 05 06:15:22 compute-0 nova_compute[186329]: 2025-12-05 06:15:22.329 186333 INFO nova.compute.manager [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Took 13.05 seconds to build instance.
Dec 05 06:15:22 compute-0 nova_compute[186329]: 2025-12-05 06:15:22.352 186333 DEBUG nova.compute.manager [req-084cc4ae-2f15-48da-abd4-a787817b0d60 req-3b166618-da91-474f-bbdc-9f84e9702ab1 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Received event network-vif-plugged-be6221fb-4c19-44a2-8009-3bb6e449bfec external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:15:22 compute-0 nova_compute[186329]: 2025-12-05 06:15:22.352 186333 DEBUG oslo_concurrency.lockutils [req-084cc4ae-2f15-48da-abd4-a787817b0d60 req-3b166618-da91-474f-bbdc-9f84e9702ab1 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "b1120a6b-8463-4925-b4b0-3ebf3845041a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:15:22 compute-0 nova_compute[186329]: 2025-12-05 06:15:22.352 186333 DEBUG oslo_concurrency.lockutils [req-084cc4ae-2f15-48da-abd4-a787817b0d60 req-3b166618-da91-474f-bbdc-9f84e9702ab1 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "b1120a6b-8463-4925-b4b0-3ebf3845041a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:15:22 compute-0 nova_compute[186329]: 2025-12-05 06:15:22.352 186333 DEBUG oslo_concurrency.lockutils [req-084cc4ae-2f15-48da-abd4-a787817b0d60 req-3b166618-da91-474f-bbdc-9f84e9702ab1 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "b1120a6b-8463-4925-b4b0-3ebf3845041a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:15:22 compute-0 nova_compute[186329]: 2025-12-05 06:15:22.352 186333 DEBUG nova.compute.manager [req-084cc4ae-2f15-48da-abd4-a787817b0d60 req-3b166618-da91-474f-bbdc-9f84e9702ab1 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] No waiting events found dispatching network-vif-plugged-be6221fb-4c19-44a2-8009-3bb6e449bfec pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:15:22 compute-0 nova_compute[186329]: 2025-12-05 06:15:22.353 186333 WARNING nova.compute.manager [req-084cc4ae-2f15-48da-abd4-a787817b0d60 req-3b166618-da91-474f-bbdc-9f84e9702ab1 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Received unexpected event network-vif-plugged-be6221fb-4c19-44a2-8009-3bb6e449bfec for instance with vm_state active and task_state None.
Dec 05 06:15:22 compute-0 nova_compute[186329]: 2025-12-05 06:15:22.778 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:22 compute-0 nova_compute[186329]: 2025-12-05 06:15:22.832 186333 DEBUG oslo_concurrency.lockutils [None req-54ac1c41-b6ea-408c-8712-8c71c262b59f bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Lock "b1120a6b-8463-4925-b4b0-3ebf3845041a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.563s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:15:23 compute-0 nova_compute[186329]: 2025-12-05 06:15:23.629 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:27 compute-0 nova_compute[186329]: 2025-12-05 06:15:27.779 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:28 compute-0 nova_compute[186329]: 2025-12-05 06:15:28.632 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:29.495 104041 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:15:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:29.497 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:15:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:29.498 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:15:29 compute-0 podman[196599]: time="2025-12-05T06:15:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:15:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:15:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18590 "" "Go-http-client/1.1"
Dec 05 06:15:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:15:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3039 "" "Go-http-client/1.1"
Dec 05 06:15:31 compute-0 openstack_network_exporter[198686]: ERROR   06:15:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:15:31 compute-0 openstack_network_exporter[198686]: ERROR   06:15:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:15:31 compute-0 openstack_network_exporter[198686]: ERROR   06:15:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:15:31 compute-0 openstack_network_exporter[198686]: ERROR   06:15:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:15:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:15:31 compute-0 openstack_network_exporter[198686]: ERROR   06:15:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:15:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:15:32 compute-0 nova_compute[186329]: 2025-12-05 06:15:32.782 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:33 compute-0 ovn_controller[95223]: 2025-12-05T06:15:33Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:3d:88:99 10.100.0.11
Dec 05 06:15:33 compute-0 ovn_controller[95223]: 2025-12-05T06:15:33Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3d:88:99 10.100.0.11
Dec 05 06:15:33 compute-0 nova_compute[186329]: 2025-12-05 06:15:33.634 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:37 compute-0 nova_compute[186329]: 2025-12-05 06:15:37.585 186333 DEBUG nova.compute.manager [None req-542e8519-a893-4c21-a15d-ddec189d2f6c e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider f2df025e-56e9-4920-9fad-1a12202c4aeb in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:636
Dec 05 06:15:37 compute-0 nova_compute[186329]: 2025-12-05 06:15:37.622 186333 DEBUG nova.compute.provider_tree [None req-542e8519-a893-4c21-a15d-ddec189d2f6c e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Updating resource provider f2df025e-56e9-4920-9fad-1a12202c4aeb generation from 4 to 8 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Dec 05 06:15:37 compute-0 nova_compute[186329]: 2025-12-05 06:15:37.784 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:38 compute-0 nova_compute[186329]: 2025-12-05 06:15:38.636 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:42 compute-0 nova_compute[186329]: 2025-12-05 06:15:42.786 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:43 compute-0 podman[208807]: 2025-12-05 06:15:43.462532872 +0000 UTC m=+0.046469767 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 06:15:43 compute-0 podman[208806]: 2025-12-05 06:15:43.490548489 +0000 UTC m=+0.074124196 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_id=ovn_controller, container_name=ovn_controller)
Dec 05 06:15:43 compute-0 nova_compute[186329]: 2025-12-05 06:15:43.638 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:43 compute-0 nova_compute[186329]: 2025-12-05 06:15:43.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:15:43 compute-0 nova_compute[186329]: 2025-12-05 06:15:43.709 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 05 06:15:44 compute-0 nova_compute[186329]: 2025-12-05 06:15:44.399 186333 DEBUG nova.virt.libvirt.driver [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Check if temp file /var/lib/nova/instances/tmpyj3mzo5y exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Dec 05 06:15:44 compute-0 nova_compute[186329]: 2025-12-05 06:15:44.402 186333 DEBUG nova.compute.manager [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpyj3mzo5y',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b1120a6b-8463-4925-b4b0-3ebf3845041a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9298
Dec 05 06:15:44 compute-0 nova_compute[186329]: 2025-12-05 06:15:44.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:15:44 compute-0 nova_compute[186329]: 2025-12-05 06:15:44.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:15:45 compute-0 nova_compute[186329]: 2025-12-05 06:15:45.220 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:15:45 compute-0 nova_compute[186329]: 2025-12-05 06:15:45.220 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:15:45 compute-0 nova_compute[186329]: 2025-12-05 06:15:45.220 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:15:45 compute-0 nova_compute[186329]: 2025-12-05 06:15:45.220 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 05 06:15:46 compute-0 nova_compute[186329]: 2025-12-05 06:15:46.249 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1120a6b-8463-4925-b4b0-3ebf3845041a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:15:46 compute-0 nova_compute[186329]: 2025-12-05 06:15:46.291 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1120a6b-8463-4925-b4b0-3ebf3845041a/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:15:46 compute-0 nova_compute[186329]: 2025-12-05 06:15:46.292 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1120a6b-8463-4925-b4b0-3ebf3845041a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:15:46 compute-0 nova_compute[186329]: 2025-12-05 06:15:46.332 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1120a6b-8463-4925-b4b0-3ebf3845041a/disk --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:15:46 compute-0 nova_compute[186329]: 2025-12-05 06:15:46.519 186333 WARNING nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:15:46 compute-0 nova_compute[186329]: 2025-12-05 06:15:46.520 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:15:46 compute-0 nova_compute[186329]: 2025-12-05 06:15:46.535 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.015s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:15:46 compute-0 nova_compute[186329]: 2025-12-05 06:15:46.536 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5686MB free_disk=73.14143371582031GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": 
"0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 05 06:15:46 compute-0 nova_compute[186329]: 2025-12-05 06:15:46.536 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:15:46 compute-0 nova_compute[186329]: 2025-12-05 06:15:46.536 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:15:47 compute-0 nova_compute[186329]: 2025-12-05 06:15:47.550 186333 INFO nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Updating resource usage from migration 83dcc9e9-9fcf-4e34-8b76-598c38ed9bc0
Dec 05 06:15:47 compute-0 nova_compute[186329]: 2025-12-05 06:15:47.606 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Migration 83dcc9e9-9fcf-4e34-8b76-598c38ed9bc0 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Dec 05 06:15:47 compute-0 nova_compute[186329]: 2025-12-05 06:15:47.606 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 05 06:15:47 compute-0 nova_compute[186329]: 2025-12-05 06:15:47.606 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 06:15:46 up 53 min,  0 user,  load average: 0.43, 0.31, 0.36\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_migrating': '1', 'num_os_type_None': '1', 'num_proj_e60b6b7be5bd497c8a89b66b86c083f5': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 05 06:15:47 compute-0 nova_compute[186329]: 2025-12-05 06:15:47.693 186333 DEBUG nova.compute.provider_tree [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:15:47 compute-0 nova_compute[186329]: 2025-12-05 06:15:47.789 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:48 compute-0 nova_compute[186329]: 2025-12-05 06:15:48.197 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:15:48 compute-0 nova_compute[186329]: 2025-12-05 06:15:48.640 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:48 compute-0 nova_compute[186329]: 2025-12-05 06:15:48.653 186333 DEBUG oslo_concurrency.processutils [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1120a6b-8463-4925-b4b0-3ebf3845041a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:15:48 compute-0 nova_compute[186329]: 2025-12-05 06:15:48.696 186333 DEBUG oslo_concurrency.processutils [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1120a6b-8463-4925-b4b0-3ebf3845041a/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:15:48 compute-0 nova_compute[186329]: 2025-12-05 06:15:48.696 186333 DEBUG oslo_concurrency.processutils [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1120a6b-8463-4925-b4b0-3ebf3845041a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:15:48 compute-0 nova_compute[186329]: 2025-12-05 06:15:48.703 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 05 06:15:48 compute-0 nova_compute[186329]: 2025-12-05 06:15:48.704 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.168s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:15:48 compute-0 nova_compute[186329]: 2025-12-05 06:15:48.739 186333 DEBUG oslo_concurrency.processutils [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1120a6b-8463-4925-b4b0-3ebf3845041a/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:15:48 compute-0 nova_compute[186329]: 2025-12-05 06:15:48.739 186333 DEBUG nova.compute.manager [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Preparing to wait for external event network-vif-plugged-be6221fb-4c19-44a2-8009-3bb6e449bfec prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 05 06:15:48 compute-0 nova_compute[186329]: 2025-12-05 06:15:48.740 186333 DEBUG oslo_concurrency.lockutils [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "b1120a6b-8463-4925-b4b0-3ebf3845041a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:15:48 compute-0 nova_compute[186329]: 2025-12-05 06:15:48.740 186333 DEBUG oslo_concurrency.lockutils [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "b1120a6b-8463-4925-b4b0-3ebf3845041a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:15:48 compute-0 nova_compute[186329]: 2025-12-05 06:15:48.740 186333 DEBUG oslo_concurrency.lockutils [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "b1120a6b-8463-4925-b4b0-3ebf3845041a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:15:49 compute-0 nova_compute[186329]: 2025-12-05 06:15:49.705 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:15:49 compute-0 nova_compute[186329]: 2025-12-05 06:15:49.706 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:15:49 compute-0 nova_compute[186329]: 2025-12-05 06:15:49.706 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:15:49 compute-0 nova_compute[186329]: 2025-12-05 06:15:49.706 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:15:50 compute-0 ovn_controller[95223]: 2025-12-05T06:15:50Z|00071|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Dec 05 06:15:50 compute-0 podman[208864]: 2025-12-05 06:15:50.473040832 +0000 UTC m=+0.052047014 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, container_name=ovn_metadata_agent)
Dec 05 06:15:50 compute-0 podman[208865]: 2025-12-05 06:15:50.482349194 +0000 UTC m=+0.058876645 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, architecture=x86_64, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, distribution-scope=public, build-date=2025-08-20T13:12:41)
Dec 05 06:15:50 compute-0 podman[208866]: 2025-12-05 06:15:50.503511104 +0000 UTC m=+0.077361071 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4)
Dec 05 06:15:50 compute-0 nova_compute[186329]: 2025-12-05 06:15:50.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:15:52 compute-0 nova_compute[186329]: 2025-12-05 06:15:52.792 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:53 compute-0 nova_compute[186329]: 2025-12-05 06:15:53.642 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:54 compute-0 nova_compute[186329]: 2025-12-05 06:15:54.661 186333 DEBUG nova.compute.manager [req-7387daad-c0f2-4684-8384-8c495f16ab6e req-71559d33-168b-4225-b2aa-6c23352f5f9d fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Received event network-vif-unplugged-be6221fb-4c19-44a2-8009-3bb6e449bfec external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:15:54 compute-0 nova_compute[186329]: 2025-12-05 06:15:54.662 186333 DEBUG oslo_concurrency.lockutils [req-7387daad-c0f2-4684-8384-8c495f16ab6e req-71559d33-168b-4225-b2aa-6c23352f5f9d fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "b1120a6b-8463-4925-b4b0-3ebf3845041a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:15:54 compute-0 nova_compute[186329]: 2025-12-05 06:15:54.662 186333 DEBUG oslo_concurrency.lockutils [req-7387daad-c0f2-4684-8384-8c495f16ab6e req-71559d33-168b-4225-b2aa-6c23352f5f9d fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "b1120a6b-8463-4925-b4b0-3ebf3845041a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:15:54 compute-0 nova_compute[186329]: 2025-12-05 06:15:54.662 186333 DEBUG oslo_concurrency.lockutils [req-7387daad-c0f2-4684-8384-8c495f16ab6e req-71559d33-168b-4225-b2aa-6c23352f5f9d fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "b1120a6b-8463-4925-b4b0-3ebf3845041a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:15:54 compute-0 nova_compute[186329]: 2025-12-05 06:15:54.662 186333 DEBUG nova.compute.manager [req-7387daad-c0f2-4684-8384-8c495f16ab6e req-71559d33-168b-4225-b2aa-6c23352f5f9d fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] No event matching network-vif-unplugged-be6221fb-4c19-44a2-8009-3bb6e449bfec in dict_keys([('network-vif-plugged', 'be6221fb-4c19-44a2-8009-3bb6e449bfec')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:350
Dec 05 06:15:54 compute-0 nova_compute[186329]: 2025-12-05 06:15:54.662 186333 DEBUG nova.compute.manager [req-7387daad-c0f2-4684-8384-8c495f16ab6e req-71559d33-168b-4225-b2aa-6c23352f5f9d fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Received event network-vif-unplugged-be6221fb-4c19-44a2-8009-3bb6e449bfec for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 05 06:15:54 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:54.830 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '46:84:d1', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'a6:99:88:db:d9:a2'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:15:54 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:54.831 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 05 06:15:54 compute-0 nova_compute[186329]: 2025-12-05 06:15:54.832 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:55 compute-0 nova_compute[186329]: 2025-12-05 06:15:55.755 186333 INFO nova.compute.manager [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Took 7.01 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Dec 05 06:15:56 compute-0 nova_compute[186329]: 2025-12-05 06:15:56.711 186333 DEBUG nova.compute.manager [req-f8223300-d817-46d2-ab3b-44aaef1cd928 req-48ac191e-dec1-4e00-a1ae-2d862ceab989 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Received event network-vif-plugged-be6221fb-4c19-44a2-8009-3bb6e449bfec external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:15:56 compute-0 nova_compute[186329]: 2025-12-05 06:15:56.711 186333 DEBUG oslo_concurrency.lockutils [req-f8223300-d817-46d2-ab3b-44aaef1cd928 req-48ac191e-dec1-4e00-a1ae-2d862ceab989 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "b1120a6b-8463-4925-b4b0-3ebf3845041a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:15:56 compute-0 nova_compute[186329]: 2025-12-05 06:15:56.711 186333 DEBUG oslo_concurrency.lockutils [req-f8223300-d817-46d2-ab3b-44aaef1cd928 req-48ac191e-dec1-4e00-a1ae-2d862ceab989 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "b1120a6b-8463-4925-b4b0-3ebf3845041a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:15:56 compute-0 nova_compute[186329]: 2025-12-05 06:15:56.712 186333 DEBUG oslo_concurrency.lockutils [req-f8223300-d817-46d2-ab3b-44aaef1cd928 req-48ac191e-dec1-4e00-a1ae-2d862ceab989 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "b1120a6b-8463-4925-b4b0-3ebf3845041a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:15:56 compute-0 nova_compute[186329]: 2025-12-05 06:15:56.712 186333 DEBUG nova.compute.manager [req-f8223300-d817-46d2-ab3b-44aaef1cd928 req-48ac191e-dec1-4e00-a1ae-2d862ceab989 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Processing event network-vif-plugged-be6221fb-4c19-44a2-8009-3bb6e449bfec _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 05 06:15:56 compute-0 nova_compute[186329]: 2025-12-05 06:15:56.712 186333 DEBUG nova.compute.manager [req-f8223300-d817-46d2-ab3b-44aaef1cd928 req-48ac191e-dec1-4e00-a1ae-2d862ceab989 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Received event network-changed-be6221fb-4c19-44a2-8009-3bb6e449bfec external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:15:56 compute-0 nova_compute[186329]: 2025-12-05 06:15:56.712 186333 DEBUG nova.compute.manager [req-f8223300-d817-46d2-ab3b-44aaef1cd928 req-48ac191e-dec1-4e00-a1ae-2d862ceab989 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Refreshing instance network info cache due to event network-changed-be6221fb-4c19-44a2-8009-3bb6e449bfec. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 05 06:15:56 compute-0 nova_compute[186329]: 2025-12-05 06:15:56.712 186333 DEBUG oslo_concurrency.lockutils [req-f8223300-d817-46d2-ab3b-44aaef1cd928 req-48ac191e-dec1-4e00-a1ae-2d862ceab989 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "refresh_cache-b1120a6b-8463-4925-b4b0-3ebf3845041a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:15:56 compute-0 nova_compute[186329]: 2025-12-05 06:15:56.712 186333 DEBUG oslo_concurrency.lockutils [req-f8223300-d817-46d2-ab3b-44aaef1cd928 req-48ac191e-dec1-4e00-a1ae-2d862ceab989 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquired lock "refresh_cache-b1120a6b-8463-4925-b4b0-3ebf3845041a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:15:56 compute-0 nova_compute[186329]: 2025-12-05 06:15:56.713 186333 DEBUG nova.network.neutron [req-f8223300-d817-46d2-ab3b-44aaef1cd928 req-48ac191e-dec1-4e00-a1ae-2d862ceab989 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Refreshing network info cache for port be6221fb-4c19-44a2-8009-3bb6e449bfec _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 05 06:15:56 compute-0 nova_compute[186329]: 2025-12-05 06:15:56.714 186333 DEBUG nova.compute.manager [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 05 06:15:56 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:56.832 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=89d40815-76f5-4f1d-9077-84d831b7d6c4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:15:57 compute-0 nova_compute[186329]: 2025-12-05 06:15:57.218 186333 WARNING neutronclient.v2_0.client [req-f8223300-d817-46d2-ab3b-44aaef1cd928 req-48ac191e-dec1-4e00-a1ae-2d862ceab989 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:15:57 compute-0 nova_compute[186329]: 2025-12-05 06:15:57.221 186333 DEBUG nova.compute.manager [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpyj3mzo5y',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b1120a6b-8463-4925-b4b0-3ebf3845041a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(83dcc9e9-9fcf-4e34-8b76-598c38ed9bc0),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9663
Dec 05 06:15:57 compute-0 nova_compute[186329]: 2025-12-05 06:15:57.731 186333 DEBUG nova.objects.instance [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lazy-loading 'migration_context' on Instance uuid b1120a6b-8463-4925-b4b0-3ebf3845041a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:15:57 compute-0 nova_compute[186329]: 2025-12-05 06:15:57.732 186333 DEBUG nova.virt.libvirt.driver [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Dec 05 06:15:57 compute-0 nova_compute[186329]: 2025-12-05 06:15:57.733 186333 WARNING neutronclient.v2_0.client [req-f8223300-d817-46d2-ab3b-44aaef1cd928 req-48ac191e-dec1-4e00-a1ae-2d862ceab989 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:15:57 compute-0 nova_compute[186329]: 2025-12-05 06:15:57.735 186333 DEBUG nova.virt.libvirt.driver [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Dec 05 06:15:57 compute-0 nova_compute[186329]: 2025-12-05 06:15:57.736 186333 DEBUG nova.virt.libvirt.driver [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Dec 05 06:15:57 compute-0 nova_compute[186329]: 2025-12-05 06:15:57.795 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:57 compute-0 nova_compute[186329]: 2025-12-05 06:15:57.856 186333 DEBUG nova.network.neutron [req-f8223300-d817-46d2-ab3b-44aaef1cd928 req-48ac191e-dec1-4e00-a1ae-2d862ceab989 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Updated VIF entry in instance network info cache for port be6221fb-4c19-44a2-8009-3bb6e449bfec. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Dec 05 06:15:57 compute-0 nova_compute[186329]: 2025-12-05 06:15:57.857 186333 DEBUG nova.network.neutron [req-f8223300-d817-46d2-ab3b-44aaef1cd928 req-48ac191e-dec1-4e00-a1ae-2d862ceab989 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Updating instance_info_cache with network_info: [{"id": "be6221fb-4c19-44a2-8009-3bb6e449bfec", "address": "fa:16:3e:3d:88:99", "network": {"id": "a344c69a-1809-4075-a0ff-98bf8b4cfc94", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1863291907-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6630eec87f9847c89bfa1dcb9f3ce845", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe6221fb-4c", "ovs_interfaceid": "be6221fb-4c19-44a2-8009-3bb6e449bfec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:15:58 compute-0 nova_compute[186329]: 2025-12-05 06:15:58.239 186333 DEBUG nova.virt.libvirt.vif [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-12-05T06:15:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-1599472882',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-1599472882',id=7,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T06:15:21Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e60b6b7be5bd497c8a89b66b86c083f5',ramdisk_id='',reservation_id='r-z288xhi6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,member,admin,reader',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image
_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-429784391',owner_user_name='tempest-TestExecuteBasicStrategy-429784391-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T06:15:21Z,user_data=None,user_id='bcefd7e9a4ec4993ad72b4790a5f4624',uuid=b1120a6b-8463-4925-b4b0-3ebf3845041a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "be6221fb-4c19-44a2-8009-3bb6e449bfec", "address": "fa:16:3e:3d:88:99", "network": {"id": "a344c69a-1809-4075-a0ff-98bf8b4cfc94", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1863291907-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6630eec87f9847c89bfa1dcb9f3ce845", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapbe6221fb-4c", "ovs_interfaceid": "be6221fb-4c19-44a2-8009-3bb6e449bfec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 05 06:15:58 compute-0 nova_compute[186329]: 2025-12-05 06:15:58.239 186333 DEBUG nova.network.os_vif_util [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Converting VIF {"id": "be6221fb-4c19-44a2-8009-3bb6e449bfec", "address": "fa:16:3e:3d:88:99", "network": {"id": "a344c69a-1809-4075-a0ff-98bf8b4cfc94", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1863291907-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6630eec87f9847c89bfa1dcb9f3ce845", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapbe6221fb-4c", "ovs_interfaceid": "be6221fb-4c19-44a2-8009-3bb6e449bfec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:15:58 compute-0 nova_compute[186329]: 2025-12-05 06:15:58.240 186333 DEBUG nova.network.os_vif_util [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3d:88:99,bridge_name='br-int',has_traffic_filtering=True,id=be6221fb-4c19-44a2-8009-3bb6e449bfec,network=Network(a344c69a-1809-4075-a0ff-98bf8b4cfc94),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe6221fb-4c') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:15:58 compute-0 nova_compute[186329]: 2025-12-05 06:15:58.240 186333 DEBUG nova.virt.libvirt.migration [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Updating guest XML with vif config: <interface type="ethernet">
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <mac address="fa:16:3e:3d:88:99"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <model type="virtio"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <driver name="vhost" rx_queue_size="512"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <mtu size="1442"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <target dev="tapbe6221fb-4c"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]: </interface>
Dec 05 06:15:58 compute-0 nova_compute[186329]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Dec 05 06:15:58 compute-0 nova_compute[186329]: 2025-12-05 06:15:58.241 186333 DEBUG nova.virt.libvirt.migration [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <name>instance-00000007</name>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <uuid>b1120a6b-8463-4925-b4b0-3ebf3845041a</uuid>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <metadata>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <nova:name>tempest-TestExecuteBasicStrategy-server-1599472882</nova:name>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <nova:creationTime>2025-12-05 06:15:17</nova:creationTime>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <nova:flavor name="m1.nano" id="cb13e320-971c-46c2-a935-d695f3631bf8">
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <nova:memory>128</nova:memory>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <nova:disk>1</nova:disk>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <nova:swap>0</nova:swap>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <nova:vcpus>1</nova:vcpus>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <nova:extraSpecs>
Dec 05 06:15:58 compute-0 nova_compute[186329]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         </nova:extraSpecs>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       </nova:flavor>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <nova:image uuid="6903ca06-7f44-4ad2-ab8b-0d16feef7d51">
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <nova:minDisk>1</nova:minDisk>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <nova:minRam>0</nova:minRam>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <nova:properties>
Dec 05 06:15:58 compute-0 nova_compute[186329]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         </nova:properties>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       </nova:image>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <nova:owner>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <nova:user uuid="bcefd7e9a4ec4993ad72b4790a5f4624">tempest-TestExecuteBasicStrategy-429784391-project-admin</nova:user>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <nova:project uuid="e60b6b7be5bd497c8a89b66b86c083f5">tempest-TestExecuteBasicStrategy-429784391</nova:project>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       </nova:owner>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <nova:root type="image" uuid="6903ca06-7f44-4ad2-ab8b-0d16feef7d51"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <nova:ports>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <nova:port uuid="be6221fb-4c19-44a2-8009-3bb6e449bfec">
Dec 05 06:15:58 compute-0 nova_compute[186329]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         </nova:port>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       </nova:ports>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </nova:instance>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   </metadata>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <memory unit="KiB">131072</memory>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <vcpu placement="static">1</vcpu>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <resource>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <partition>/machine</partition>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   </resource>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <sysinfo type="smbios">
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <system>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <entry name="manufacturer">RDO</entry>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <entry name="product">OpenStack Compute</entry>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <entry name="serial">b1120a6b-8463-4925-b4b0-3ebf3845041a</entry>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <entry name="uuid">b1120a6b-8463-4925-b4b0-3ebf3845041a</entry>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <entry name="family">Virtual Machine</entry>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </system>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   </sysinfo>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <os>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <boot dev="hd"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <smbios mode="sysinfo"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   </os>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <features>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <acpi/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <apic/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <vmcoreinfo state="on"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   </features>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <cpu mode="custom" match="exact" check="partial">
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <model fallback="allow">Nehalem</model>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   </cpu>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <clock offset="utc">
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <timer name="hpet" present="no"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   </clock>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <on_poweroff>destroy</on_poweroff>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <on_reboot>restart</on_reboot>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <on_crash>destroy</on_crash>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <devices>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <disk type="file" device="disk">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <source file="/var/lib/nova/instances/b1120a6b-8463-4925-b4b0-3ebf3845041a/disk"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target dev="vda" bus="virtio"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </disk>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <disk type="file" device="cdrom">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <source file="/var/lib/nova/instances/b1120a6b-8463-4925-b4b0-3ebf3845041a/disk.config"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target dev="sda" bus="sata"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <readonly/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </disk>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="1" port="0x10"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="2" port="0x11"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="3" port="0x12"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="4" port="0x13"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="5" port="0x14"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="6" port="0x15"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="7" port="0x16"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="8" port="0x17"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="9" port="0x18"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="10" port="0x19"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="11" port="0x1a"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="12" port="0x1b"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="13" port="0x1c"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="14" port="0x1d"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="15" port="0x1e"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="16" port="0x1f"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="17" port="0x20"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="18" port="0x21"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="19" port="0x22"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="20" port="0x23"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="21" port="0x24"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="22" port="0x25"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="23" port="0x26"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="24" port="0x27"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="25" port="0x28"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-pci-bridge"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="sata" index="0">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <interface type="ethernet"><mac address="fa:16:3e:3d:88:99"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapbe6221fb-4c"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </interface><serial type="pty">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <log file="/var/lib/nova/instances/b1120a6b-8463-4925-b4b0-3ebf3845041a/console.log" append="off"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target type="isa-serial" port="0">
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <model name="isa-serial"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       </target>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </serial>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <console type="pty">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <log file="/var/lib/nova/instances/b1120a6b-8463-4925-b4b0-3ebf3845041a/console.log" append="off"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target type="serial" port="0"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </console>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <input type="tablet" bus="usb">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="usb" bus="0" port="1"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </input>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <input type="mouse" bus="ps2"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <listen type="address" address="::"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </graphics>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <video>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model type="virtio" heads="1" primary="yes"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </video>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <stats period="10"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </memballoon>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <rng model="virtio">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <backend model="random">/dev/urandom</backend>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </rng>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   </devices>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]: </domain>
Dec 05 06:15:58 compute-0 nova_compute[186329]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Dec 05 06:15:58 compute-0 nova_compute[186329]: 2025-12-05 06:15:58.241 186333 DEBUG nova.virt.libvirt.migration [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <name>instance-00000007</name>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <uuid>b1120a6b-8463-4925-b4b0-3ebf3845041a</uuid>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <metadata>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <nova:name>tempest-TestExecuteBasicStrategy-server-1599472882</nova:name>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <nova:creationTime>2025-12-05 06:15:17</nova:creationTime>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <nova:flavor name="m1.nano" id="cb13e320-971c-46c2-a935-d695f3631bf8">
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <nova:memory>128</nova:memory>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <nova:disk>1</nova:disk>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <nova:swap>0</nova:swap>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <nova:vcpus>1</nova:vcpus>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <nova:extraSpecs>
Dec 05 06:15:58 compute-0 nova_compute[186329]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         </nova:extraSpecs>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       </nova:flavor>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <nova:image uuid="6903ca06-7f44-4ad2-ab8b-0d16feef7d51">
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <nova:minDisk>1</nova:minDisk>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <nova:minRam>0</nova:minRam>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <nova:properties>
Dec 05 06:15:58 compute-0 nova_compute[186329]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         </nova:properties>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       </nova:image>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <nova:owner>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <nova:user uuid="bcefd7e9a4ec4993ad72b4790a5f4624">tempest-TestExecuteBasicStrategy-429784391-project-admin</nova:user>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <nova:project uuid="e60b6b7be5bd497c8a89b66b86c083f5">tempest-TestExecuteBasicStrategy-429784391</nova:project>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       </nova:owner>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <nova:root type="image" uuid="6903ca06-7f44-4ad2-ab8b-0d16feef7d51"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <nova:ports>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <nova:port uuid="be6221fb-4c19-44a2-8009-3bb6e449bfec">
Dec 05 06:15:58 compute-0 nova_compute[186329]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         </nova:port>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       </nova:ports>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </nova:instance>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   </metadata>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <memory unit="KiB">131072</memory>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <vcpu placement="static">1</vcpu>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <resource>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <partition>/machine</partition>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   </resource>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <sysinfo type="smbios">
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <system>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <entry name="manufacturer">RDO</entry>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <entry name="product">OpenStack Compute</entry>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <entry name="serial">b1120a6b-8463-4925-b4b0-3ebf3845041a</entry>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <entry name="uuid">b1120a6b-8463-4925-b4b0-3ebf3845041a</entry>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <entry name="family">Virtual Machine</entry>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </system>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   </sysinfo>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <os>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <boot dev="hd"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <smbios mode="sysinfo"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   </os>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <features>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <acpi/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <apic/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <vmcoreinfo state="on"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   </features>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <cpu mode="custom" match="exact" check="partial">
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <model fallback="allow">Nehalem</model>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   </cpu>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <clock offset="utc">
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <timer name="hpet" present="no"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   </clock>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <on_poweroff>destroy</on_poweroff>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <on_reboot>restart</on_reboot>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <on_crash>destroy</on_crash>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <devices>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <disk type="file" device="disk">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <source file="/var/lib/nova/instances/b1120a6b-8463-4925-b4b0-3ebf3845041a/disk"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target dev="vda" bus="virtio"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </disk>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <disk type="file" device="cdrom">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <source file="/var/lib/nova/instances/b1120a6b-8463-4925-b4b0-3ebf3845041a/disk.config"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target dev="sda" bus="sata"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <readonly/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </disk>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="1" port="0x10"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="2" port="0x11"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="3" port="0x12"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="4" port="0x13"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="5" port="0x14"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="6" port="0x15"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="7" port="0x16"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="8" port="0x17"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="9" port="0x18"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="10" port="0x19"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="11" port="0x1a"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="12" port="0x1b"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="13" port="0x1c"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="14" port="0x1d"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="15" port="0x1e"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="16" port="0x1f"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="17" port="0x20"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="18" port="0x21"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="19" port="0x22"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="20" port="0x23"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="21" port="0x24"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="22" port="0x25"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="23" port="0x26"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="24" port="0x27"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="25" port="0x28"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-pci-bridge"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="sata" index="0">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <interface type="ethernet">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <mac address="fa:16:3e:3d:88:99"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model type="virtio"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <mtu size="1442"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target dev="tapbe6221fb-4c"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </interface>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <serial type="pty">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <log file="/var/lib/nova/instances/b1120a6b-8463-4925-b4b0-3ebf3845041a/console.log" append="off"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target type="isa-serial" port="0">
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <model name="isa-serial"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       </target>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </serial>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <console type="pty">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <log file="/var/lib/nova/instances/b1120a6b-8463-4925-b4b0-3ebf3845041a/console.log" append="off"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target type="serial" port="0"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </console>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <input type="tablet" bus="usb">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="usb" bus="0" port="1"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </input>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <input type="mouse" bus="ps2"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <listen type="address" address="::"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </graphics>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <video>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model type="virtio" heads="1" primary="yes"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </video>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <stats period="10"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </memballoon>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <rng model="virtio">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <backend model="random">/dev/urandom</backend>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </rng>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   </devices>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]: </domain>
Dec 05 06:15:58 compute-0 nova_compute[186329]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Dec 05 06:15:58 compute-0 nova_compute[186329]: 2025-12-05 06:15:58.241 186333 DEBUG nova.virt.libvirt.migration [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] _update_pci_xml output xml=<domain type="kvm">
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <name>instance-00000007</name>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <uuid>b1120a6b-8463-4925-b4b0-3ebf3845041a</uuid>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <metadata>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <nova:name>tempest-TestExecuteBasicStrategy-server-1599472882</nova:name>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <nova:creationTime>2025-12-05 06:15:17</nova:creationTime>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <nova:flavor name="m1.nano" id="cb13e320-971c-46c2-a935-d695f3631bf8">
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <nova:memory>128</nova:memory>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <nova:disk>1</nova:disk>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <nova:swap>0</nova:swap>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <nova:vcpus>1</nova:vcpus>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <nova:extraSpecs>
Dec 05 06:15:58 compute-0 nova_compute[186329]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         </nova:extraSpecs>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       </nova:flavor>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <nova:image uuid="6903ca06-7f44-4ad2-ab8b-0d16feef7d51">
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <nova:minDisk>1</nova:minDisk>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <nova:minRam>0</nova:minRam>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <nova:properties>
Dec 05 06:15:58 compute-0 nova_compute[186329]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         </nova:properties>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       </nova:image>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <nova:owner>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <nova:user uuid="bcefd7e9a4ec4993ad72b4790a5f4624">tempest-TestExecuteBasicStrategy-429784391-project-admin</nova:user>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <nova:project uuid="e60b6b7be5bd497c8a89b66b86c083f5">tempest-TestExecuteBasicStrategy-429784391</nova:project>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       </nova:owner>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <nova:root type="image" uuid="6903ca06-7f44-4ad2-ab8b-0d16feef7d51"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <nova:ports>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <nova:port uuid="be6221fb-4c19-44a2-8009-3bb6e449bfec">
Dec 05 06:15:58 compute-0 nova_compute[186329]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:         </nova:port>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       </nova:ports>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </nova:instance>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   </metadata>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <memory unit="KiB">131072</memory>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <currentMemory unit="KiB">131072</currentMemory>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <vcpu placement="static">1</vcpu>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <resource>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <partition>/machine</partition>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   </resource>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <sysinfo type="smbios">
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <system>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <entry name="manufacturer">RDO</entry>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <entry name="product">OpenStack Compute</entry>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <entry name="serial">b1120a6b-8463-4925-b4b0-3ebf3845041a</entry>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <entry name="uuid">b1120a6b-8463-4925-b4b0-3ebf3845041a</entry>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <entry name="family">Virtual Machine</entry>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </system>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   </sysinfo>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <os>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <type arch="x86_64" machine="pc-q35-rhel9.8.0">hvm</type>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <boot dev="hd"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <smbios mode="sysinfo"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   </os>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <features>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <acpi/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <apic/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <vmcoreinfo state="on"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   </features>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <cpu mode="custom" match="exact" check="partial">
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <model fallback="allow">Nehalem</model>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   </cpu>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <clock offset="utc">
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <timer name="hpet" present="no"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   </clock>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <on_poweroff>destroy</on_poweroff>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <on_reboot>restart</on_reboot>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <on_crash>destroy</on_crash>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <devices>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <disk type="file" device="disk">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <source file="/var/lib/nova/instances/b1120a6b-8463-4925-b4b0-3ebf3845041a/disk"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target dev="vda" bus="virtio"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </disk>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <disk type="file" device="cdrom">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <source file="/var/lib/nova/instances/b1120a6b-8463-4925-b4b0-3ebf3845041a/disk.config"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target dev="sda" bus="sata"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <readonly/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </disk>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="0" model="pcie-root"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="1" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="1" port="0x10"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="2" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="2" port="0x11"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="3" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="3" port="0x12"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="4" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="4" port="0x13"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="5" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="5" port="0x14"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="6" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="6" port="0x15"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="7" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="7" port="0x16"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="8" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="8" port="0x17"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="9" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="9" port="0x18"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="10" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="10" port="0x19"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="11" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="11" port="0x1a"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="12" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="12" port="0x1b"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="13" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="13" port="0x1c"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="14" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="14" port="0x1d"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="15" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="15" port="0x1e"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="16" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="16" port="0x1f"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="17" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="17" port="0x20"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="18" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="18" port="0x21"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="19" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="19" port="0x22"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="20" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="20" port="0x23"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="21" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="21" port="0x24"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="22" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="22" port="0x25"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="23" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="23" port="0x26"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="24" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="24" port="0x27"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="25" model="pcie-root-port">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-root-port"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target chassis="25" port="0x28"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model name="pcie-pci-bridge"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="usb" index="0" model="piix3-uhci">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <controller type="sata" index="0">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </controller>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <interface type="ethernet"><mac address="fa:16:3e:3d:88:99"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapbe6221fb-4c"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </interface><serial type="pty">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <log file="/var/lib/nova/instances/b1120a6b-8463-4925-b4b0-3ebf3845041a/console.log" append="off"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target type="isa-serial" port="0">
Dec 05 06:15:58 compute-0 nova_compute[186329]:         <model name="isa-serial"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       </target>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </serial>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <console type="pty">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <log file="/var/lib/nova/instances/b1120a6b-8463-4925-b4b0-3ebf3845041a/console.log" append="off"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <target type="serial" port="0"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </console>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <input type="tablet" bus="usb">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="usb" bus="0" port="1"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </input>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <input type="mouse" bus="ps2"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <listen type="address" address="::"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </graphics>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <video>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <model type="virtio" heads="1" primary="yes"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </video>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <stats period="10"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </memballoon>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     <rng model="virtio">
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <backend model="random">/dev/urandom</backend>
Dec 05 06:15:58 compute-0 nova_compute[186329]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]:     </rng>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   </devices>
Dec 05 06:15:58 compute-0 nova_compute[186329]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Dec 05 06:15:58 compute-0 nova_compute[186329]: </domain>
Dec 05 06:15:58 compute-0 nova_compute[186329]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Dec 05 06:15:58 compute-0 nova_compute[186329]: 2025-12-05 06:15:58.241 186333 DEBUG nova.virt.libvirt.driver [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Dec 05 06:15:58 compute-0 nova_compute[186329]: 2025-12-05 06:15:58.242 186333 DEBUG nova.virt.libvirt.driver [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Dec 05 06:15:58 compute-0 nova_compute[186329]: 2025-12-05 06:15:58.242 186333 DEBUG nova.virt.libvirt.driver [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Dec 05 06:15:58 compute-0 nova_compute[186329]: 2025-12-05 06:15:58.361 186333 DEBUG oslo_concurrency.lockutils [req-f8223300-d817-46d2-ab3b-44aaef1cd928 req-48ac191e-dec1-4e00-a1ae-2d862ceab989 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Releasing lock "refresh_cache-b1120a6b-8463-4925-b4b0-3ebf3845041a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:15:58 compute-0 nova_compute[186329]: 2025-12-05 06:15:58.644 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:58 compute-0 nova_compute[186329]: 2025-12-05 06:15:58.669 186333 DEBUG oslo_concurrency.lockutils [None req-1775505e-08d8-45ff-8881-feb42af62a17 bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Acquiring lock "b1120a6b-8463-4925-b4b0-3ebf3845041a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:15:58 compute-0 nova_compute[186329]: 2025-12-05 06:15:58.669 186333 DEBUG oslo_concurrency.lockutils [None req-1775505e-08d8-45ff-8881-feb42af62a17 bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Lock "b1120a6b-8463-4925-b4b0-3ebf3845041a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:15:58 compute-0 nova_compute[186329]: 2025-12-05 06:15:58.670 186333 DEBUG oslo_concurrency.lockutils [None req-1775505e-08d8-45ff-8881-feb42af62a17 bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Acquiring lock "b1120a6b-8463-4925-b4b0-3ebf3845041a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:15:58 compute-0 nova_compute[186329]: 2025-12-05 06:15:58.670 186333 DEBUG oslo_concurrency.lockutils [None req-1775505e-08d8-45ff-8881-feb42af62a17 bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Lock "b1120a6b-8463-4925-b4b0-3ebf3845041a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:15:58 compute-0 nova_compute[186329]: 2025-12-05 06:15:58.670 186333 DEBUG oslo_concurrency.lockutils [None req-1775505e-08d8-45ff-8881-feb42af62a17 bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Lock "b1120a6b-8463-4925-b4b0-3ebf3845041a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:15:58 compute-0 nova_compute[186329]: 2025-12-05 06:15:58.677 186333 INFO nova.compute.manager [None req-1775505e-08d8-45ff-8881-feb42af62a17 bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Terminating instance
Dec 05 06:15:58 compute-0 nova_compute[186329]: 2025-12-05 06:15:58.744 186333 DEBUG nova.virt.libvirt.migration [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Dec 05 06:15:58 compute-0 nova_compute[186329]: 2025-12-05 06:15:58.744 186333 INFO nova.virt.libvirt.migration [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Increasing downtime to 50 ms after 0 sec elapsed time
Dec 05 06:15:59 compute-0 nova_compute[186329]: 2025-12-05 06:15:59.188 186333 DEBUG nova.compute.manager [None req-1775505e-08d8-45ff-8881-feb42af62a17 bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 05 06:15:59 compute-0 kernel: tapbe6221fb-4c (unregistering): left promiscuous mode
Dec 05 06:15:59 compute-0 NetworkManager[55434]: <info>  [1764915359.2129] device (tapbe6221fb-4c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 06:15:59 compute-0 ovn_controller[95223]: 2025-12-05T06:15:59Z|00072|binding|INFO|Releasing lport be6221fb-4c19-44a2-8009-3bb6e449bfec from this chassis (sb_readonly=0)
Dec 05 06:15:59 compute-0 ovn_controller[95223]: 2025-12-05T06:15:59Z|00073|binding|INFO|Setting lport be6221fb-4c19-44a2-8009-3bb6e449bfec down in Southbound
Dec 05 06:15:59 compute-0 nova_compute[186329]: 2025-12-05 06:15:59.219 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:59 compute-0 ovn_controller[95223]: 2025-12-05T06:15:59Z|00074|binding|INFO|Removing iface tapbe6221fb-4c ovn-installed in OVS
Dec 05 06:15:59 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:59.233 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3d:88:99 10.100.0.11'], port_security=['fa:16:3e:3d:88:99 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'b1120a6b-8463-4925-b4b0-3ebf3845041a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a344c69a-1809-4075-a0ff-98bf8b4cfc94', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e60b6b7be5bd497c8a89b66b86c083f5', 'neutron:revision_number': '10', 'neutron:security_group_ids': '7386e586-83f2-448e-bac2-9982d533a1d5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2f3c6a57-7d9b-47e3-a5c4-019f1f878562, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], logical_port=be6221fb-4c19-44a2-8009-3bb6e449bfec) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:15:59 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:59.234 104041 INFO neutron.agent.ovn.metadata.agent [-] Port be6221fb-4c19-44a2-8009-3bb6e449bfec in datapath a344c69a-1809-4075-a0ff-98bf8b4cfc94 unbound from our chassis
Dec 05 06:15:59 compute-0 nova_compute[186329]: 2025-12-05 06:15:59.235 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:59 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:59.235 104041 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a344c69a-1809-4075-a0ff-98bf8b4cfc94, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 05 06:15:59 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:59.236 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[331c4d96-f422-42df-b093-9ff2ad04cf7b]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:15:59 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:59.237 104041 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a344c69a-1809-4075-a0ff-98bf8b4cfc94 namespace which is not needed anymore
Dec 05 06:15:59 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000007.scope: Deactivated successfully.
Dec 05 06:15:59 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000007.scope: Consumed 12.399s CPU time.
Dec 05 06:15:59 compute-0 systemd-machined[152967]: Machine qemu-4-instance-00000007 terminated.
Dec 05 06:15:59 compute-0 nova_compute[186329]: 2025-12-05 06:15:59.309 186333 DEBUG nova.compute.manager [req-5b984195-b655-4127-a85f-c6ccefc39917 req-2cb2ecca-709e-4390-9863-b7db599be727 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Received event network-vif-unplugged-be6221fb-4c19-44a2-8009-3bb6e449bfec external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:15:59 compute-0 nova_compute[186329]: 2025-12-05 06:15:59.309 186333 DEBUG oslo_concurrency.lockutils [req-5b984195-b655-4127-a85f-c6ccefc39917 req-2cb2ecca-709e-4390-9863-b7db599be727 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "b1120a6b-8463-4925-b4b0-3ebf3845041a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:15:59 compute-0 nova_compute[186329]: 2025-12-05 06:15:59.310 186333 DEBUG oslo_concurrency.lockutils [req-5b984195-b655-4127-a85f-c6ccefc39917 req-2cb2ecca-709e-4390-9863-b7db599be727 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "b1120a6b-8463-4925-b4b0-3ebf3845041a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:15:59 compute-0 nova_compute[186329]: 2025-12-05 06:15:59.310 186333 DEBUG oslo_concurrency.lockutils [req-5b984195-b655-4127-a85f-c6ccefc39917 req-2cb2ecca-709e-4390-9863-b7db599be727 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "b1120a6b-8463-4925-b4b0-3ebf3845041a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:15:59 compute-0 nova_compute[186329]: 2025-12-05 06:15:59.310 186333 DEBUG nova.compute.manager [req-5b984195-b655-4127-a85f-c6ccefc39917 req-2cb2ecca-709e-4390-9863-b7db599be727 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] No waiting events found dispatching network-vif-unplugged-be6221fb-4c19-44a2-8009-3bb6e449bfec pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:15:59 compute-0 nova_compute[186329]: 2025-12-05 06:15:59.311 186333 DEBUG nova.compute.manager [req-5b984195-b655-4127-a85f-c6ccefc39917 req-2cb2ecca-709e-4390-9863-b7db599be727 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Received event network-vif-unplugged-be6221fb-4c19-44a2-8009-3bb6e449bfec for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 05 06:15:59 compute-0 neutron-haproxy-ovnmeta-a344c69a-1809-4075-a0ff-98bf8b4cfc94[208778]: [NOTICE]   (208788) : haproxy version is 3.0.5-8e879a5
Dec 05 06:15:59 compute-0 neutron-haproxy-ovnmeta-a344c69a-1809-4075-a0ff-98bf8b4cfc94[208778]: [NOTICE]   (208788) : path to executable is /usr/sbin/haproxy
Dec 05 06:15:59 compute-0 neutron-haproxy-ovnmeta-a344c69a-1809-4075-a0ff-98bf8b4cfc94[208778]: [WARNING]  (208788) : Exiting Master process...
Dec 05 06:15:59 compute-0 podman[208950]: 2025-12-05 06:15:59.329405315 +0000 UTC m=+0.022512521 container kill 9cdaa4a5c49eee793d291154a78fdf848b08c7a544c1d59321fd953a7b1cb1e8 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-a344c69a-1809-4075-a0ff-98bf8b4cfc94, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 05 06:15:59 compute-0 neutron-haproxy-ovnmeta-a344c69a-1809-4075-a0ff-98bf8b4cfc94[208778]: [ALERT]    (208788) : Current worker (208790) exited with code 143 (Terminated)
Dec 05 06:15:59 compute-0 neutron-haproxy-ovnmeta-a344c69a-1809-4075-a0ff-98bf8b4cfc94[208778]: [WARNING]  (208788) : All workers exited. Exiting... (0)
Dec 05 06:15:59 compute-0 systemd[1]: libpod-9cdaa4a5c49eee793d291154a78fdf848b08c7a544c1d59321fd953a7b1cb1e8.scope: Deactivated successfully.
Dec 05 06:15:59 compute-0 podman[208962]: 2025-12-05 06:15:59.363944419 +0000 UTC m=+0.019883978 container died 9cdaa4a5c49eee793d291154a78fdf848b08c7a544c1d59321fd953a7b1cb1e8 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-a344c69a-1809-4075-a0ff-98bf8b4cfc94, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 05 06:15:59 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9cdaa4a5c49eee793d291154a78fdf848b08c7a544c1d59321fd953a7b1cb1e8-userdata-shm.mount: Deactivated successfully.
Dec 05 06:15:59 compute-0 systemd[1]: var-lib-containers-storage-overlay-05267951eb060f87af14245fe8864155cdd66618e1e4b20a620dbc226437ac7e-merged.mount: Deactivated successfully.
Dec 05 06:15:59 compute-0 podman[208962]: 2025-12-05 06:15:59.382383238 +0000 UTC m=+0.038322798 container cleanup 9cdaa4a5c49eee793d291154a78fdf848b08c7a544c1d59321fd953a7b1cb1e8 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-a344c69a-1809-4075-a0ff-98bf8b4cfc94, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.license=GPLv2)
Dec 05 06:15:59 compute-0 systemd[1]: libpod-conmon-9cdaa4a5c49eee793d291154a78fdf848b08c7a544c1d59321fd953a7b1cb1e8.scope: Deactivated successfully.
Dec 05 06:15:59 compute-0 podman[208964]: 2025-12-05 06:15:59.390971527 +0000 UTC m=+0.040308814 container remove 9cdaa4a5c49eee793d291154a78fdf848b08c7a544c1d59321fd953a7b1cb1e8 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-a344c69a-1809-4075-a0ff-98bf8b4cfc94, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 05 06:15:59 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:59.394 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[89a310fc-0c41-4faa-bf6b-90bbdca71fcb]: (4, ("Fri Dec  5 06:15:59 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-a344c69a-1809-4075-a0ff-98bf8b4cfc94 (9cdaa4a5c49eee793d291154a78fdf848b08c7a544c1d59321fd953a7b1cb1e8)\n9cdaa4a5c49eee793d291154a78fdf848b08c7a544c1d59321fd953a7b1cb1e8\nFri Dec  5 06:15:59 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a344c69a-1809-4075-a0ff-98bf8b4cfc94 (9cdaa4a5c49eee793d291154a78fdf848b08c7a544c1d59321fd953a7b1cb1e8)\n9cdaa4a5c49eee793d291154a78fdf848b08c7a544c1d59321fd953a7b1cb1e8\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:15:59 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:59.397 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[0ad7ac4d-1d79-4536-a8ef-79d8f26e0fcd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:15:59 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:59.397 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a344c69a-1809-4075-a0ff-98bf8b4cfc94.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a344c69a-1809-4075-a0ff-98bf8b4cfc94.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:15:59 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:59.398 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[dde903ed-ebae-4741-9ca1-f645887089e6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:15:59 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:59.398 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa344c69a-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:15:59 compute-0 nova_compute[186329]: 2025-12-05 06:15:59.400 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:59 compute-0 nova_compute[186329]: 2025-12-05 06:15:59.415 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:59 compute-0 kernel: tapa344c69a-10: left promiscuous mode
Dec 05 06:15:59 compute-0 nova_compute[186329]: 2025-12-05 06:15:59.419 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:59 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:59.421 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[838cc711-a4f8-445d-83f7-dd07653a3c66]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:15:59 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:59.429 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[b6e6055a-80ba-4b8c-8b56-7f8638e0c1ea]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:15:59 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:59.429 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[39c5d60d-e164-42f9-8d80-2220a57c75b2]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:15:59 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:59.442 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[cfda6463-5235-4872-963b-e811322c2988]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 319872, 'reachable_time': 25442, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209002, 'error': None, 'target': 'ovnmeta-a344c69a-1809-4075-a0ff-98bf8b4cfc94', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:15:59 compute-0 systemd[1]: run-netns-ovnmeta\x2da344c69a\x2d1809\x2d4075\x2da0ff\x2d98bf8b4cfc94.mount: Deactivated successfully.
Dec 05 06:15:59 compute-0 virtqemud[186605]: operation failed: domain is not running
Dec 05 06:15:59 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:59.446 104153 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a344c69a-1809-4075-a0ff-98bf8b4cfc94 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 05 06:15:59 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:15:59.446 104153 DEBUG oslo.privsep.daemon [-] privsep: reply[a3658ab8-45ae-477b-b7dc-f5bca76e180f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:15:59 compute-0 nova_compute[186329]: 2025-12-05 06:15:59.449 186333 INFO nova.virt.libvirt.driver [-] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Instance destroyed successfully.
Dec 05 06:15:59 compute-0 nova_compute[186329]: 2025-12-05 06:15:59.450 186333 DEBUG nova.objects.instance [None req-1775505e-08d8-45ff-8881-feb42af62a17 bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Lazy-loading 'resources' on Instance uuid b1120a6b-8463-4925-b4b0-3ebf3845041a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:15:59 compute-0 virtqemud[186605]: operation failed: job 'migration in' failed: load of migration failed: Input/output error
Dec 05 06:15:59 compute-0 virtqemud[186605]: operation failed: domain 'instance-00000007' is not running
Dec 05 06:15:59 compute-0 nova_compute[186329]: 2025-12-05 06:15:59.689 186333 ERROR nova.virt.libvirt.driver [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Live Migration failure: operation failed: domain is not running: libvirt.libvirtError: operation failed: domain is not running
Dec 05 06:15:59 compute-0 nova_compute[186329]: 2025-12-05 06:15:59.690 186333 DEBUG nova.virt.libvirt.driver [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Dec 05 06:15:59 compute-0 podman[196599]: time="2025-12-05T06:15:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:15:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:15:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:15:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:15:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2574 "" "Go-http-client/1.1"
Dec 05 06:15:59 compute-0 nova_compute[186329]: 2025-12-05 06:15:59.754 186333 INFO nova.virt.libvirt.driver [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Dec 05 06:15:59 compute-0 nova_compute[186329]: 2025-12-05 06:15:59.954 186333 DEBUG nova.virt.libvirt.vif [None req-1775505e-08d8-45ff-8881-feb42af62a17 bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-12-05T06:15:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-1599472882',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-1599472882',id=7,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T06:15:21Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e60b6b7be5bd497c8a89b66b86c083f5',ramdisk_id='',reservation_id='r-z288xhi6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,member,admin,reader',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',ima
ge_min_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-429784391',owner_user_name='tempest-TestExecuteBasicStrategy-429784391-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T06:15:39Z,user_data=None,user_id='bcefd7e9a4ec4993ad72b4790a5f4624',uuid=b1120a6b-8463-4925-b4b0-3ebf3845041a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "be6221fb-4c19-44a2-8009-3bb6e449bfec", "address": "fa:16:3e:3d:88:99", "network": {"id": "a344c69a-1809-4075-a0ff-98bf8b4cfc94", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1863291907-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6630eec87f9847c89bfa1dcb9f3ce845", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe6221fb-4c", "ovs_interfaceid": "be6221fb-4c19-44a2-8009-3bb6e449bfec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 05 06:15:59 compute-0 nova_compute[186329]: 2025-12-05 06:15:59.954 186333 DEBUG nova.network.os_vif_util [None req-1775505e-08d8-45ff-8881-feb42af62a17 bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Converting VIF {"id": "be6221fb-4c19-44a2-8009-3bb6e449bfec", "address": "fa:16:3e:3d:88:99", "network": {"id": "a344c69a-1809-4075-a0ff-98bf8b4cfc94", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1863291907-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6630eec87f9847c89bfa1dcb9f3ce845", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe6221fb-4c", "ovs_interfaceid": "be6221fb-4c19-44a2-8009-3bb6e449bfec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:15:59 compute-0 nova_compute[186329]: 2025-12-05 06:15:59.955 186333 DEBUG nova.network.os_vif_util [None req-1775505e-08d8-45ff-8881-feb42af62a17 bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3d:88:99,bridge_name='br-int',has_traffic_filtering=True,id=be6221fb-4c19-44a2-8009-3bb6e449bfec,network=Network(a344c69a-1809-4075-a0ff-98bf8b4cfc94),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe6221fb-4c') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:15:59 compute-0 nova_compute[186329]: 2025-12-05 06:15:59.955 186333 DEBUG os_vif [None req-1775505e-08d8-45ff-8881-feb42af62a17 bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:3d:88:99,bridge_name='br-int',has_traffic_filtering=True,id=be6221fb-4c19-44a2-8009-3bb6e449bfec,network=Network(a344c69a-1809-4075-a0ff-98bf8b4cfc94),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe6221fb-4c') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 05 06:15:59 compute-0 nova_compute[186329]: 2025-12-05 06:15:59.956 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:59 compute-0 nova_compute[186329]: 2025-12-05 06:15:59.957 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbe6221fb-4c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:15:59 compute-0 nova_compute[186329]: 2025-12-05 06:15:59.957 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:59 compute-0 nova_compute[186329]: 2025-12-05 06:15:59.959 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:59 compute-0 nova_compute[186329]: 2025-12-05 06:15:59.959 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:59 compute-0 nova_compute[186329]: 2025-12-05 06:15:59.959 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=9686ab1c-a20f-4385-a6cd-84c37161b0a0) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:15:59 compute-0 nova_compute[186329]: 2025-12-05 06:15:59.960 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:59 compute-0 nova_compute[186329]: 2025-12-05 06:15:59.961 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:15:59 compute-0 nova_compute[186329]: 2025-12-05 06:15:59.962 186333 INFO os_vif [None req-1775505e-08d8-45ff-8881-feb42af62a17 bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:3d:88:99,bridge_name='br-int',has_traffic_filtering=True,id=be6221fb-4c19-44a2-8009-3bb6e449bfec,network=Network(a344c69a-1809-4075-a0ff-98bf8b4cfc94),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe6221fb-4c')
Dec 05 06:15:59 compute-0 nova_compute[186329]: 2025-12-05 06:15:59.963 186333 INFO nova.virt.libvirt.driver [None req-1775505e-08d8-45ff-8881-feb42af62a17 bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Deleting instance files /var/lib/nova/instances/b1120a6b-8463-4925-b4b0-3ebf3845041a_del
Dec 05 06:15:59 compute-0 nova_compute[186329]: 2025-12-05 06:15:59.963 186333 INFO nova.virt.libvirt.driver [None req-1775505e-08d8-45ff-8881-feb42af62a17 bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Deletion of /var/lib/nova/instances/b1120a6b-8463-4925-b4b0-3ebf3845041a_del complete
Dec 05 06:16:00 compute-0 nova_compute[186329]: 2025-12-05 06:16:00.256 186333 DEBUG nova.virt.libvirt.guest [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Dec 05 06:16:00 compute-0 nova_compute[186329]: 2025-12-05 06:16:00.256 186333 INFO nova.virt.libvirt.driver [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Migration operation has completed
Dec 05 06:16:00 compute-0 nova_compute[186329]: 2025-12-05 06:16:00.257 186333 INFO nova.compute.manager [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] _post_live_migration() is started..
Dec 05 06:16:00 compute-0 nova_compute[186329]: 2025-12-05 06:16:00.263 186333 WARNING neutronclient.v2_0.client [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:16:00 compute-0 nova_compute[186329]: 2025-12-05 06:16:00.263 186333 WARNING neutronclient.v2_0.client [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:16:00 compute-0 nova_compute[186329]: 2025-12-05 06:16:00.472 186333 INFO nova.compute.manager [None req-1775505e-08d8-45ff-8881-feb42af62a17 bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Took 1.28 seconds to destroy the instance on the hypervisor.
Dec 05 06:16:00 compute-0 nova_compute[186329]: 2025-12-05 06:16:00.472 186333 DEBUG oslo.service.backend._eventlet.loopingcall [None req-1775505e-08d8-45ff-8881-feb42af62a17 bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 05 06:16:00 compute-0 nova_compute[186329]: 2025-12-05 06:16:00.473 186333 DEBUG nova.compute.manager [-] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 05 06:16:00 compute-0 nova_compute[186329]: 2025-12-05 06:16:00.473 186333 DEBUG nova.network.neutron [-] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 05 06:16:00 compute-0 nova_compute[186329]: 2025-12-05 06:16:00.473 186333 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:16:00 compute-0 nova_compute[186329]: 2025-12-05 06:16:00.549 186333 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:16:00 compute-0 nova_compute[186329]: 2025-12-05 06:16:00.761 186333 DEBUG nova.network.neutron [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Activated binding for port be6221fb-4c19-44a2-8009-3bb6e449bfec and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Dec 05 06:16:00 compute-0 nova_compute[186329]: 2025-12-05 06:16:00.761 186333 DEBUG nova.compute.manager [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "be6221fb-4c19-44a2-8009-3bb6e449bfec", "address": "fa:16:3e:3d:88:99", "network": {"id": "a344c69a-1809-4075-a0ff-98bf8b4cfc94", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1863291907-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6630eec87f9847c89bfa1dcb9f3ce845", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe6221fb-4c", "ovs_interfaceid": "be6221fb-4c19-44a2-8009-3bb6e449bfec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10063
Dec 05 06:16:00 compute-0 nova_compute[186329]: 2025-12-05 06:16:00.762 186333 DEBUG nova.virt.libvirt.vif [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-12-05T06:15:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-1599472882',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-1599472882',id=7,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T06:15:21Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e60b6b7be5bd497c8a89b66b86c083f5',ramdisk_id='',reservation_id='r-z288xhi6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,member,admin,reader',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image
_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-429784391',owner_user_name='tempest-TestExecuteBasicStrategy-429784391-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T06:15:58Z,user_data=None,user_id='bcefd7e9a4ec4993ad72b4790a5f4624',uuid=b1120a6b-8463-4925-b4b0-3ebf3845041a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "be6221fb-4c19-44a2-8009-3bb6e449bfec", "address": "fa:16:3e:3d:88:99", "network": {"id": "a344c69a-1809-4075-a0ff-98bf8b4cfc94", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1863291907-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6630eec87f9847c89bfa1dcb9f3ce845", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe6221fb-4c", "ovs_interfaceid": "be6221fb-4c19-44a2-8009-3bb6e449bfec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 05 06:16:00 compute-0 nova_compute[186329]: 2025-12-05 06:16:00.762 186333 DEBUG nova.network.os_vif_util [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Converting VIF {"id": "be6221fb-4c19-44a2-8009-3bb6e449bfec", "address": "fa:16:3e:3d:88:99", "network": {"id": "a344c69a-1809-4075-a0ff-98bf8b4cfc94", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1863291907-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6630eec87f9847c89bfa1dcb9f3ce845", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe6221fb-4c", "ovs_interfaceid": "be6221fb-4c19-44a2-8009-3bb6e449bfec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:16:00 compute-0 nova_compute[186329]: 2025-12-05 06:16:00.762 186333 DEBUG nova.network.os_vif_util [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3d:88:99,bridge_name='br-int',has_traffic_filtering=True,id=be6221fb-4c19-44a2-8009-3bb6e449bfec,network=Network(a344c69a-1809-4075-a0ff-98bf8b4cfc94),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe6221fb-4c') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:16:00 compute-0 nova_compute[186329]: 2025-12-05 06:16:00.763 186333 DEBUG os_vif [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3d:88:99,bridge_name='br-int',has_traffic_filtering=True,id=be6221fb-4c19-44a2-8009-3bb6e449bfec,network=Network(a344c69a-1809-4075-a0ff-98bf8b4cfc94),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe6221fb-4c') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 05 06:16:00 compute-0 nova_compute[186329]: 2025-12-05 06:16:00.764 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:16:00 compute-0 nova_compute[186329]: 2025-12-05 06:16:00.764 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbe6221fb-4c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:16:00 compute-0 nova_compute[186329]: 2025-12-05 06:16:00.764 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:16:00 compute-0 nova_compute[186329]: 2025-12-05 06:16:00.765 186333 INFO os_vif [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3d:88:99,bridge_name='br-int',has_traffic_filtering=True,id=be6221fb-4c19-44a2-8009-3bb6e449bfec,network=Network(a344c69a-1809-4075-a0ff-98bf8b4cfc94),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe6221fb-4c')
Dec 05 06:16:00 compute-0 nova_compute[186329]: 2025-12-05 06:16:00.766 186333 DEBUG oslo_concurrency.lockutils [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:16:00 compute-0 nova_compute[186329]: 2025-12-05 06:16:00.766 186333 DEBUG oslo_concurrency.lockutils [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:16:00 compute-0 nova_compute[186329]: 2025-12-05 06:16:00.766 186333 DEBUG oslo_concurrency.lockutils [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:16:00 compute-0 nova_compute[186329]: 2025-12-05 06:16:00.766 186333 DEBUG nova.compute.manager [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10086
Dec 05 06:16:00 compute-0 nova_compute[186329]: 2025-12-05 06:16:00.767 186333 INFO nova.virt.libvirt.driver [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Deletion of /var/lib/nova/instances/b1120a6b-8463-4925-b4b0-3ebf3845041a_del complete
Dec 05 06:16:01 compute-0 nova_compute[186329]: 2025-12-05 06:16:01.300 186333 DEBUG nova.network.neutron [-] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:16:01 compute-0 nova_compute[186329]: 2025-12-05 06:16:01.349 186333 DEBUG nova.compute.manager [req-15caa3a5-5fb8-474c-8725-5d7c70e8bd0d req-e4c138d1-c47e-4821-972b-1c7c96babb4b fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Received event network-vif-plugged-be6221fb-4c19-44a2-8009-3bb6e449bfec external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:16:01 compute-0 nova_compute[186329]: 2025-12-05 06:16:01.349 186333 DEBUG oslo_concurrency.lockutils [req-15caa3a5-5fb8-474c-8725-5d7c70e8bd0d req-e4c138d1-c47e-4821-972b-1c7c96babb4b fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "b1120a6b-8463-4925-b4b0-3ebf3845041a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:16:01 compute-0 nova_compute[186329]: 2025-12-05 06:16:01.349 186333 DEBUG oslo_concurrency.lockutils [req-15caa3a5-5fb8-474c-8725-5d7c70e8bd0d req-e4c138d1-c47e-4821-972b-1c7c96babb4b fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "b1120a6b-8463-4925-b4b0-3ebf3845041a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:16:01 compute-0 nova_compute[186329]: 2025-12-05 06:16:01.349 186333 DEBUG oslo_concurrency.lockutils [req-15caa3a5-5fb8-474c-8725-5d7c70e8bd0d req-e4c138d1-c47e-4821-972b-1c7c96babb4b fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "b1120a6b-8463-4925-b4b0-3ebf3845041a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:16:01 compute-0 nova_compute[186329]: 2025-12-05 06:16:01.349 186333 DEBUG nova.compute.manager [req-15caa3a5-5fb8-474c-8725-5d7c70e8bd0d req-e4c138d1-c47e-4821-972b-1c7c96babb4b fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] No waiting events found dispatching network-vif-plugged-be6221fb-4c19-44a2-8009-3bb6e449bfec pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:16:01 compute-0 nova_compute[186329]: 2025-12-05 06:16:01.349 186333 WARNING nova.compute.manager [req-15caa3a5-5fb8-474c-8725-5d7c70e8bd0d req-e4c138d1-c47e-4821-972b-1c7c96babb4b fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Received unexpected event network-vif-plugged-be6221fb-4c19-44a2-8009-3bb6e449bfec for instance with vm_state active and task_state deleting.
Dec 05 06:16:01 compute-0 nova_compute[186329]: 2025-12-05 06:16:01.350 186333 DEBUG nova.compute.manager [req-15caa3a5-5fb8-474c-8725-5d7c70e8bd0d req-e4c138d1-c47e-4821-972b-1c7c96babb4b fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Received event network-vif-unplugged-be6221fb-4c19-44a2-8009-3bb6e449bfec external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:16:01 compute-0 nova_compute[186329]: 2025-12-05 06:16:01.350 186333 DEBUG oslo_concurrency.lockutils [req-15caa3a5-5fb8-474c-8725-5d7c70e8bd0d req-e4c138d1-c47e-4821-972b-1c7c96babb4b fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "b1120a6b-8463-4925-b4b0-3ebf3845041a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:16:01 compute-0 nova_compute[186329]: 2025-12-05 06:16:01.350 186333 DEBUG oslo_concurrency.lockutils [req-15caa3a5-5fb8-474c-8725-5d7c70e8bd0d req-e4c138d1-c47e-4821-972b-1c7c96babb4b fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "b1120a6b-8463-4925-b4b0-3ebf3845041a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:16:01 compute-0 nova_compute[186329]: 2025-12-05 06:16:01.350 186333 DEBUG oslo_concurrency.lockutils [req-15caa3a5-5fb8-474c-8725-5d7c70e8bd0d req-e4c138d1-c47e-4821-972b-1c7c96babb4b fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "b1120a6b-8463-4925-b4b0-3ebf3845041a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:16:01 compute-0 nova_compute[186329]: 2025-12-05 06:16:01.350 186333 DEBUG nova.compute.manager [req-15caa3a5-5fb8-474c-8725-5d7c70e8bd0d req-e4c138d1-c47e-4821-972b-1c7c96babb4b fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] No waiting events found dispatching network-vif-unplugged-be6221fb-4c19-44a2-8009-3bb6e449bfec pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:16:01 compute-0 nova_compute[186329]: 2025-12-05 06:16:01.351 186333 DEBUG nova.compute.manager [req-15caa3a5-5fb8-474c-8725-5d7c70e8bd0d req-e4c138d1-c47e-4821-972b-1c7c96babb4b fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Received event network-vif-unplugged-be6221fb-4c19-44a2-8009-3bb6e449bfec for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 05 06:16:01 compute-0 nova_compute[186329]: 2025-12-05 06:16:01.351 186333 DEBUG nova.compute.manager [req-15caa3a5-5fb8-474c-8725-5d7c70e8bd0d req-e4c138d1-c47e-4821-972b-1c7c96babb4b fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Received event network-vif-unplugged-be6221fb-4c19-44a2-8009-3bb6e449bfec external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:16:01 compute-0 nova_compute[186329]: 2025-12-05 06:16:01.351 186333 DEBUG oslo_concurrency.lockutils [req-15caa3a5-5fb8-474c-8725-5d7c70e8bd0d req-e4c138d1-c47e-4821-972b-1c7c96babb4b fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "b1120a6b-8463-4925-b4b0-3ebf3845041a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:16:01 compute-0 nova_compute[186329]: 2025-12-05 06:16:01.351 186333 DEBUG oslo_concurrency.lockutils [req-15caa3a5-5fb8-474c-8725-5d7c70e8bd0d req-e4c138d1-c47e-4821-972b-1c7c96babb4b fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "b1120a6b-8463-4925-b4b0-3ebf3845041a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:16:01 compute-0 nova_compute[186329]: 2025-12-05 06:16:01.351 186333 DEBUG oslo_concurrency.lockutils [req-15caa3a5-5fb8-474c-8725-5d7c70e8bd0d req-e4c138d1-c47e-4821-972b-1c7c96babb4b fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "b1120a6b-8463-4925-b4b0-3ebf3845041a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:16:01 compute-0 nova_compute[186329]: 2025-12-05 06:16:01.351 186333 DEBUG nova.compute.manager [req-15caa3a5-5fb8-474c-8725-5d7c70e8bd0d req-e4c138d1-c47e-4821-972b-1c7c96babb4b fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] No waiting events found dispatching network-vif-unplugged-be6221fb-4c19-44a2-8009-3bb6e449bfec pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:16:01 compute-0 nova_compute[186329]: 2025-12-05 06:16:01.351 186333 DEBUG nova.compute.manager [req-15caa3a5-5fb8-474c-8725-5d7c70e8bd0d req-e4c138d1-c47e-4821-972b-1c7c96babb4b fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Received event network-vif-unplugged-be6221fb-4c19-44a2-8009-3bb6e449bfec for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 05 06:16:01 compute-0 nova_compute[186329]: 2025-12-05 06:16:01.352 186333 DEBUG nova.compute.manager [req-15caa3a5-5fb8-474c-8725-5d7c70e8bd0d req-e4c138d1-c47e-4821-972b-1c7c96babb4b fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Received event network-vif-deleted-be6221fb-4c19-44a2-8009-3bb6e449bfec external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:16:01 compute-0 openstack_network_exporter[198686]: ERROR   06:16:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:16:01 compute-0 openstack_network_exporter[198686]: ERROR   06:16:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:16:01 compute-0 openstack_network_exporter[198686]: ERROR   06:16:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:16:01 compute-0 openstack_network_exporter[198686]: ERROR   06:16:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:16:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:16:01 compute-0 openstack_network_exporter[198686]: ERROR   06:16:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:16:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:16:01 compute-0 nova_compute[186329]: 2025-12-05 06:16:01.804 186333 INFO nova.compute.manager [-] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Took 1.33 seconds to deallocate network for instance.
Dec 05 06:16:02 compute-0 nova_compute[186329]: 2025-12-05 06:16:02.314 186333 DEBUG oslo_concurrency.lockutils [None req-1775505e-08d8-45ff-8881-feb42af62a17 bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:16:02 compute-0 nova_compute[186329]: 2025-12-05 06:16:02.315 186333 DEBUG oslo_concurrency.lockutils [None req-1775505e-08d8-45ff-8881-feb42af62a17 bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:16:02 compute-0 nova_compute[186329]: 2025-12-05 06:16:02.350 186333 DEBUG nova.compute.provider_tree [None req-1775505e-08d8-45ff-8881-feb42af62a17 bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:16:02 compute-0 nova_compute[186329]: 2025-12-05 06:16:02.854 186333 DEBUG nova.scheduler.client.report [None req-1775505e-08d8-45ff-8881-feb42af62a17 bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:16:03 compute-0 nova_compute[186329]: 2025-12-05 06:16:03.360 186333 DEBUG oslo_concurrency.lockutils [None req-1775505e-08d8-45ff-8881-feb42af62a17 bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.045s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:16:03 compute-0 nova_compute[186329]: 2025-12-05 06:16:03.376 186333 INFO nova.scheduler.client.report [None req-1775505e-08d8-45ff-8881-feb42af62a17 bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Deleted allocations for instance b1120a6b-8463-4925-b4b0-3ebf3845041a
Dec 05 06:16:03 compute-0 nova_compute[186329]: 2025-12-05 06:16:03.645 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:16:04 compute-0 nova_compute[186329]: 2025-12-05 06:16:04.392 186333 DEBUG oslo_concurrency.lockutils [None req-1775505e-08d8-45ff-8881-feb42af62a17 bcefd7e9a4ec4993ad72b4790a5f4624 e60b6b7be5bd497c8a89b66b86c083f5 - - default default] Lock "b1120a6b-8463-4925-b4b0-3ebf3845041a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.723s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:16:04 compute-0 nova_compute[186329]: 2025-12-05 06:16:04.961 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Post live migration at destination compute-1.ctlplane.example.com failed: nova.exception_Remote.InstanceNotFound_Remote: Instance b1120a6b-8463-4925-b4b0-3ebf3845041a could not be found.
Dec 05 06:16:05 compute-0 nova_compute[186329]: Traceback (most recent call last):
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/conductor/manager.py", line 143, in _object_dispatch
Dec 05 06:16:05 compute-0 nova_compute[186329]:     return getattr(target, method)(*args, **kwargs)
Dec 05 06:16:05 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_versionedobjects/base.py", line 226, in wrapper
Dec 05 06:16:05 compute-0 nova_compute[186329]:     return fn(self, *args, **kwargs)
Dec 05 06:16:05 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/objects/instance.py", line 878, in save
Dec 05 06:16:05 compute-0 nova_compute[186329]:     old_ref, inst_ref = db.instance_update_and_get_original(
Dec 05 06:16:05 compute-0 nova_compute[186329]:                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/db/utils.py", line 35, in wrapper
Dec 05 06:16:05 compute-0 nova_compute[186329]:     return f(*args, **kwargs)
Dec 05 06:16:05 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 144, in wrapper
Dec 05 06:16:05 compute-0 nova_compute[186329]:     with excutils.save_and_reraise_exception() as ectxt:
Dec 05 06:16:05 compute-0 nova_compute[186329]:          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 05 06:16:05 compute-0 nova_compute[186329]:     self.force_reraise()
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 05 06:16:05 compute-0 nova_compute[186329]:     raise self.value
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 142, in wrapper
Dec 05 06:16:05 compute-0 nova_compute[186329]:     return f(*args, **kwargs)
Dec 05 06:16:05 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 207, in wrapper
Dec 05 06:16:05 compute-0 nova_compute[186329]:     return f(context, *args, **kwargs)
Dec 05 06:16:05 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 2301, in instance_update_and_get_original
Dec 05 06:16:05 compute-0 nova_compute[186329]:     instance_ref = _instance_get_by_uuid(context, instance_uuid,
Dec 05 06:16:05 compute-0 nova_compute[186329]:                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 1411, in _instance_get_by_uuid
Dec 05 06:16:05 compute-0 nova_compute[186329]:     raise exception.InstanceNotFound(instance_id=uuid)
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]: nova.exception.InstanceNotFound: Instance b1120a6b-8463-4925-b4b0-3ebf3845041a could not be found.
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]: Traceback (most recent call last):
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py", line 682, in _get_domain
Dec 05 06:16:05 compute-0 nova_compute[186329]:     return conn.lookupByUUIDString(instance.uuid)
Dec 05 06:16:05 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/eventlet/tpool.py", line 186, in doit
Dec 05 06:16:05 compute-0 nova_compute[186329]:     result = proxy_call(self._autowrap, f, *args, **kwargs)
Dec 05 06:16:05 compute-0 nova_compute[186329]:              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/eventlet/tpool.py", line 144, in proxy_call
Dec 05 06:16:05 compute-0 nova_compute[186329]:     rv = execute(f, *args, **kwargs)
Dec 05 06:16:05 compute-0 nova_compute[186329]:          ^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/eventlet/tpool.py", line 125, in execute
Dec 05 06:16:05 compute-0 nova_compute[186329]:     raise e.with_traceback(tb)
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/eventlet/tpool.py", line 82, in tworker
Dec 05 06:16:05 compute-0 nova_compute[186329]:     rv = meth(*args, **kwargs)
Dec 05 06:16:05 compute-0 nova_compute[186329]:          ^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib64/python3.12/site-packages/libvirt.py", line 5121, in lookupByUUIDString
Dec 05 06:16:05 compute-0 nova_compute[186329]:     raise libvirtError('virDomainLookupByUUIDString() failed')
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]: libvirt.libvirtError: Domain not found: no domain with matching uuid 'b1120a6b-8463-4925-b4b0-3ebf3845041a'
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]: During handling of the above exception, another exception occurred:
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]: Traceback (most recent call last):
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 10205, in post_live_migration_at_destination
Dec 05 06:16:05 compute-0 nova_compute[186329]:     with excutils.save_and_reraise_exception():
Dec 05 06:16:05 compute-0 nova_compute[186329]:          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 05 06:16:05 compute-0 nova_compute[186329]:     self.force_reraise()
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 05 06:16:05 compute-0 nova_compute[186329]:     raise self.value
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 10201, in post_live_migration_at_destination
Dec 05 06:16:05 compute-0 nova_compute[186329]:     self.driver.post_live_migration_at_destination(
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py", line 12076, in post_live_migration_at_destination
Dec 05 06:16:05 compute-0 nova_compute[186329]:     self._reattach_instance_vifs(context, instance, network_info)
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py", line 11621, in _reattach_instance_vifs
Dec 05 06:16:05 compute-0 nova_compute[186329]:     guest = self._host.get_guest(instance)
Dec 05 06:16:05 compute-0 nova_compute[186329]:             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py", line 666, in get_guest
Dec 05 06:16:05 compute-0 nova_compute[186329]:     return libvirt_guest.Guest(self._get_domain(instance))
Dec 05 06:16:05 compute-0 nova_compute[186329]:                                ^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py", line 686, in _get_domain
Dec 05 06:16:05 compute-0 nova_compute[186329]:     raise exception.InstanceNotFound(instance_id=instance.uuid)
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]: nova.exception.InstanceNotFound: Instance b1120a6b-8463-4925-b4b0-3ebf3845041a could not be found.
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]: During handling of the above exception, another exception occurred:
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]: Traceback (most recent call last):
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/server.py", line 174, in _process_incoming
Dec 05 06:16:05 compute-0 nova_compute[186329]:     res = self.dispatcher.dispatch(message)
Dec 05 06:16:05 compute-0 nova_compute[186329]:           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
Dec 05 06:16:05 compute-0 nova_compute[186329]:     return self._do_dispatch(endpoint, method, ctxt, args)
Dec 05 06:16:05 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
Dec 05 06:16:05 compute-0 nova_compute[186329]:     result = func(ctxt, **new_args)
Dec 05 06:16:05 compute-0 nova_compute[186329]:              ^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/exception_wrapper.py", line 65, in wrapped
Dec 05 06:16:05 compute-0 nova_compute[186329]:     with excutils.save_and_reraise_exception():
Dec 05 06:16:05 compute-0 nova_compute[186329]:          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 05 06:16:05 compute-0 nova_compute[186329]:     self.force_reraise()
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 05 06:16:05 compute-0 nova_compute[186329]:     raise self.value
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/exception_wrapper.py", line 63, in wrapped
Dec 05 06:16:05 compute-0 nova_compute[186329]:     return f(self, context, *args, **kw)
Dec 05 06:16:05 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/compute/utils.py", line 1483, in decorated_function
Dec 05 06:16:05 compute-0 nova_compute[186329]:     return function(self, context, *args, **kwargs)
Dec 05 06:16:05 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 203, in decorated_function
Dec 05 06:16:05 compute-0 nova_compute[186329]:     return function(self, context, *args, **kwargs)
Dec 05 06:16:05 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 10237, in post_live_migration_at_destination
Dec 05 06:16:05 compute-0 nova_compute[186329]:     instance.save(expected_task_state=task_states.MIGRATING)
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_versionedobjects/base.py", line 209, in wrapper
Dec 05 06:16:05 compute-0 nova_compute[186329]:     updates, result = self.indirection_api.object_action(
Dec 05 06:16:05 compute-0 nova_compute[186329]:                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/conductor/rpcapi.py", line 247, in object_action
Dec 05 06:16:05 compute-0 nova_compute[186329]:     return cctxt.call(context, 'object_action', objinst=objinst,
Dec 05 06:16:05 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/client.py", line 180, in call
Dec 05 06:16:05 compute-0 nova_compute[186329]:     result = self.transport._send(
Dec 05 06:16:05 compute-0 nova_compute[186329]:              ^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_messaging/transport.py", line 123, in _send
Dec 05 06:16:05 compute-0 nova_compute[186329]:     return self._driver.send(target, ctxt, message,
Dec 05 06:16:05 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 794, in send
Dec 05 06:16:05 compute-0 nova_compute[186329]:     return self._send(target, ctxt, message, wait_for_reply, timeout,
Dec 05 06:16:05 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 786, in _send
Dec 05 06:16:05 compute-0 nova_compute[186329]:     raise result
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]: nova.exception_Remote.InstanceNotFound_Remote: Instance b1120a6b-8463-4925-b4b0-3ebf3845041a could not be found.
Dec 05 06:16:05 compute-0 nova_compute[186329]: Traceback (most recent call last):
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/conductor/manager.py", line 143, in _object_dispatch
Dec 05 06:16:05 compute-0 nova_compute[186329]:     return getattr(target, method)(*args, **kwargs)
Dec 05 06:16:05 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_versionedobjects/base.py", line 226, in wrapper
Dec 05 06:16:05 compute-0 nova_compute[186329]:     return fn(self, *args, **kwargs)
Dec 05 06:16:05 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/objects/instance.py", line 878, in save
Dec 05 06:16:05 compute-0 nova_compute[186329]:     old_ref, inst_ref = db.instance_update_and_get_original(
Dec 05 06:16:05 compute-0 nova_compute[186329]:                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/db/utils.py", line 35, in wrapper
Dec 05 06:16:05 compute-0 nova_compute[186329]:     return f(*args, **kwargs)
Dec 05 06:16:05 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 144, in wrapper
Dec 05 06:16:05 compute-0 nova_compute[186329]:     with excutils.save_and_reraise_exception() as ectxt:
Dec 05 06:16:05 compute-0 nova_compute[186329]:          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 05 06:16:05 compute-0 nova_compute[186329]:     self.force_reraise()
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 05 06:16:05 compute-0 nova_compute[186329]:     raise self.value
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 142, in wrapper
Dec 05 06:16:05 compute-0 nova_compute[186329]:     return f(*args, **kwargs)
Dec 05 06:16:05 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 207, in wrapper
Dec 05 06:16:05 compute-0 nova_compute[186329]:     return f(context, *args, **kwargs)
Dec 05 06:16:05 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 2301, in instance_update_and_get_original
Dec 05 06:16:05 compute-0 nova_compute[186329]:     instance_ref = _instance_get_by_uuid(context, instance_uuid,
Dec 05 06:16:05 compute-0 nova_compute[186329]:                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 1411, in _instance_get_by_uuid
Dec 05 06:16:05 compute-0 nova_compute[186329]:     raise exception.InstanceNotFound(instance_id=uuid)
Dec 05 06:16:05 compute-0 nova_compute[186329]: 
Dec 05 06:16:05 compute-0 nova_compute[186329]: nova.exception.InstanceNotFound: Instance b1120a6b-8463-4925-b4b0-3ebf3845041a could not be found.
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Traceback (most recent call last):
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 10097, in _post_live_migration
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     self.compute_rpcapi.post_live_migration_at_destination(ctxt,
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/compute/rpcapi.py", line 932, in post_live_migration_at_destination
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return cctxt.call(ctxt, 'post_live_migration_at_destination',
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/client.py", line 180, in call
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     result = self.transport._send(
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]              ^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_messaging/transport.py", line 123, in _send
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return self._driver.send(target, ctxt, message,
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 794, in send
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return self._send(target, ctxt, message, wait_for_reply, timeout,
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 786, in _send
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise result
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] nova.exception_Remote.InstanceNotFound_Remote: Instance b1120a6b-8463-4925-b4b0-3ebf3845041a could not be found.
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Traceback (most recent call last):
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/conductor/manager.py", line 143, in _object_dispatch
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return getattr(target, method)(*args, **kwargs)
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_versionedobjects/base.py", line 226, in wrapper
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return fn(self, *args, **kwargs)
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/objects/instance.py", line 878, in save
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     old_ref, inst_ref = db.instance_update_and_get_original(
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/db/utils.py", line 35, in wrapper
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return f(*args, **kwargs)
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 144, in wrapper
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     with excutils.save_and_reraise_exception() as ectxt:
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     self.force_reraise()
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise self.value
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 142, in wrapper
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return f(*args, **kwargs)
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 207, in wrapper
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return f(context, *args, **kwargs)
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 2301, in instance_update_and_get_original
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     instance_ref = _instance_get_by_uuid(context, instance_uuid,
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 1411, in _instance_get_by_uuid
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise exception.InstanceNotFound(instance_id=uuid)
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] nova.exception.InstanceNotFound: Instance b1120a6b-8463-4925-b4b0-3ebf3845041a could not be found.
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Traceback (most recent call last):
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py", line 682, in _get_domain
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return conn.lookupByUUIDString(instance.uuid)
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/eventlet/tpool.py", line 186, in doit
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     result = proxy_call(self._autowrap, f, *args, **kwargs)
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/eventlet/tpool.py", line 144, in proxy_call
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     rv = execute(f, *args, **kwargs)
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]          ^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/eventlet/tpool.py", line 125, in execute
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise e.with_traceback(tb)
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/eventlet/tpool.py", line 82, in tworker
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     rv = meth(*args, **kwargs)
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]          ^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib64/python3.12/site-packages/libvirt.py", line 5121, in lookupByUUIDString
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise libvirtError('virDomainLookupByUUIDString() failed')
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] libvirt.libvirtError: Domain not found: no domain with matching uuid 'b1120a6b-8463-4925-b4b0-3ebf3845041a'
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] During handling of the above exception, another exception occurred:
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Traceback (most recent call last):
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 10205, in post_live_migration_at_destination
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     with excutils.save_and_reraise_exception():
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     self.force_reraise()
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise self.value
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 10201, in post_live_migration_at_destination
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     self.driver.post_live_migration_at_destination(
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py", line 12076, in post_live_migration_at_destination
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     self._reattach_instance_vifs(context, instance, network_info)
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py", line 11621, in _reattach_instance_vifs
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     guest = self._host.get_guest(instance)
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py", line 666, in get_guest
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return libvirt_guest.Guest(self._get_domain(instance))
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]                                ^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py", line 686, in _get_domain
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise exception.InstanceNotFound(instance_id=instance.uuid)
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] nova.exception.InstanceNotFound: Instance b1120a6b-8463-4925-b4b0-3ebf3845041a could not be found.
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] During handling of the above exception, another exception occurred:
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Traceback (most recent call last):
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/server.py", line 174, in _process_incoming
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     res = self.dispatcher.dispatch(message)
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return self._do_dispatch(endpoint, method, ctxt, args)
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     result = func(ctxt, **new_args)
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]              ^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/exception_wrapper.py", line 65, in wrapped
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     with excutils.save_and_reraise_exception():
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     self.force_reraise()
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise self.value
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/exception_wrapper.py", line 63, in wrapped
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return f(self, context, *args, **kw)
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/compute/utils.py", line 1483, in decorated_function
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return function(self, context, *args, **kwargs)
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 203, in decorated_function
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return function(self, context, *args, **kwargs)
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 10237, in post_live_migration_at_destination
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     instance.save(expected_task_state=task_states.MIGRATING)
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_versionedobjects/base.py", line 209, in wrapper
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     updates, result = self.indirection_api.object_action(
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/conductor/rpcapi.py", line 247, in object_action
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return cctxt.call(context, 'object_action', objinst=objinst,
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/client.py", line 180, in call
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     result = self.transport._send(
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]              ^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_messaging/transport.py", line 123, in _send
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return self._driver.send(target, ctxt, message,
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 794, in send
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return self._send(target, ctxt, message, wait_for_reply, timeout,
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 786, in _send
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise result
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] nova.exception_Remote.InstanceNotFound_Remote: Instance b1120a6b-8463-4925-b4b0-3ebf3845041a could not be found.
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Traceback (most recent call last):
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/conductor/manager.py", line 143, in _object_dispatch
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return getattr(target, method)(*args, **kwargs)
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_versionedobjects/base.py", line 226, in wrapper
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return fn(self, *args, **kwargs)
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/objects/instance.py", line 878, in save
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     old_ref, inst_ref = db.instance_update_and_get_original(
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/db/utils.py", line 35, in wrapper
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return f(*args, **kwargs)
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 144, in wrapper
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     with excutils.save_and_reraise_exception() as ectxt:
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     self.force_reraise()
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise self.value
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 142, in wrapper
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return f(*args, **kwargs)
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 207, in wrapper
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return f(context, *args, **kwargs)
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 2301, in instance_update_and_get_original
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     instance_ref = _instance_get_by_uuid(context, instance_uuid,
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 1411, in _instance_get_by_uuid
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise exception.InstanceNotFound(instance_id=uuid)
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] nova.exception.InstanceNotFound: Instance b1120a6b-8463-4925-b4b0-3ebf3845041a could not be found.
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:05 compute-0 nova_compute[186329]: 2025-12-05 06:16:05.786 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:06 compute-0 nova_compute[186329]: 2025-12-05 06:16:06.337 186333 DEBUG nova.objects.instance [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 WARNING nova.virt.libvirt.driver [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Error monitoring migration: Instance b1120a6b-8463-4925-b4b0-3ebf3845041a could not be found.
Dec 05 06:16:07 compute-0 nova_compute[186329]: Traceback (most recent call last):
Dec 05 06:16:07 compute-0 nova_compute[186329]: 
Dec 05 06:16:07 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/conductor/manager.py", line 143, in _object_dispatch
Dec 05 06:16:07 compute-0 nova_compute[186329]:     return getattr(target, method)(*args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 
Dec 05 06:16:07 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_versionedobjects/base.py", line 226, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]:     return fn(self, *args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 
Dec 05 06:16:07 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/objects/instance.py", line 878, in save
Dec 05 06:16:07 compute-0 nova_compute[186329]:     old_ref, inst_ref = db.instance_update_and_get_original(
Dec 05 06:16:07 compute-0 nova_compute[186329]:                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 
Dec 05 06:16:07 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/db/utils.py", line 35, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]:     return f(*args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 
Dec 05 06:16:07 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 144, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]:     with excutils.save_and_reraise_exception() as ectxt:
Dec 05 06:16:07 compute-0 nova_compute[186329]:          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 
Dec 05 06:16:07 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 05 06:16:07 compute-0 nova_compute[186329]:     self.force_reraise()
Dec 05 06:16:07 compute-0 nova_compute[186329]: 
Dec 05 06:16:07 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 05 06:16:07 compute-0 nova_compute[186329]:     raise self.value
Dec 05 06:16:07 compute-0 nova_compute[186329]: 
Dec 05 06:16:07 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 142, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]:     return f(*args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 
Dec 05 06:16:07 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 207, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]:     return f(context, *args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 
Dec 05 06:16:07 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 2301, in instance_update_and_get_original
Dec 05 06:16:07 compute-0 nova_compute[186329]:     instance_ref = _instance_get_by_uuid(context, instance_uuid,
Dec 05 06:16:07 compute-0 nova_compute[186329]:                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 
Dec 05 06:16:07 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 1411, in _instance_get_by_uuid
Dec 05 06:16:07 compute-0 nova_compute[186329]:     raise exception.InstanceNotFound(instance_id=uuid)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 
Dec 05 06:16:07 compute-0 nova_compute[186329]: nova.exception.InstanceNotFound: Instance b1120a6b-8463-4925-b4b0-3ebf3845041a could not be found.
Dec 05 06:16:07 compute-0 nova_compute[186329]: : nova.exception_Remote.InstanceNotFound_Remote: Instance b1120a6b-8463-4925-b4b0-3ebf3845041a could not be found.
Dec 05 06:16:07 compute-0 nova_compute[186329]: Traceback (most recent call last):
Dec 05 06:16:07 compute-0 nova_compute[186329]: 
Dec 05 06:16:07 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/conductor/manager.py", line 143, in _object_dispatch
Dec 05 06:16:07 compute-0 nova_compute[186329]:     return getattr(target, method)(*args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 
Dec 05 06:16:07 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_versionedobjects/base.py", line 226, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]:     return fn(self, *args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 
Dec 05 06:16:07 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/objects/instance.py", line 878, in save
Dec 05 06:16:07 compute-0 nova_compute[186329]:     old_ref, inst_ref = db.instance_update_and_get_original(
Dec 05 06:16:07 compute-0 nova_compute[186329]:                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 
Dec 05 06:16:07 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/db/utils.py", line 35, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]:     return f(*args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 
Dec 05 06:16:07 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 144, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]:     with excutils.save_and_reraise_exception() as ectxt:
Dec 05 06:16:07 compute-0 nova_compute[186329]:          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 
Dec 05 06:16:07 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 05 06:16:07 compute-0 nova_compute[186329]:     self.force_reraise()
Dec 05 06:16:07 compute-0 nova_compute[186329]: 
Dec 05 06:16:07 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 05 06:16:07 compute-0 nova_compute[186329]:     raise self.value
Dec 05 06:16:07 compute-0 nova_compute[186329]: 
Dec 05 06:16:07 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 142, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]:     return f(*args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 
Dec 05 06:16:07 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 207, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]:     return f(context, *args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 
Dec 05 06:16:07 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 2301, in instance_update_and_get_original
Dec 05 06:16:07 compute-0 nova_compute[186329]:     instance_ref = _instance_get_by_uuid(context, instance_uuid,
Dec 05 06:16:07 compute-0 nova_compute[186329]:                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 
Dec 05 06:16:07 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 1411, in _instance_get_by_uuid
Dec 05 06:16:07 compute-0 nova_compute[186329]:     raise exception.InstanceNotFound(instance_id=uuid)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 
Dec 05 06:16:07 compute-0 nova_compute[186329]: nova.exception.InstanceNotFound: Instance b1120a6b-8463-4925-b4b0-3ebf3845041a could not be found.
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Traceback (most recent call last):
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 9948, in _post_live_migration_update_host
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     self._post_live_migration(
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/exception_wrapper.py", line 65, in wrapped
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     with excutils.save_and_reraise_exception():
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     self.force_reraise()
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise self.value
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/exception_wrapper.py", line 63, in wrapped
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return f(self, context, *args, **kw)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 203, in decorated_function
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return function(self, context, *args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 10097, in _post_live_migration
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     self.compute_rpcapi.post_live_migration_at_destination(ctxt,
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/compute/rpcapi.py", line 932, in post_live_migration_at_destination
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return cctxt.call(ctxt, 'post_live_migration_at_destination',
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/client.py", line 180, in call
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     result = self.transport._send(
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]              ^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_messaging/transport.py", line 123, in _send
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return self._driver.send(target, ctxt, message,
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 794, in send
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return self._send(target, ctxt, message, wait_for_reply, timeout,
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 786, in _send
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise result
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] nova.exception_Remote.InstanceNotFound_Remote: Instance b1120a6b-8463-4925-b4b0-3ebf3845041a could not be found.
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Traceback (most recent call last):
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/conductor/manager.py", line 143, in _object_dispatch
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return getattr(target, method)(*args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_versionedobjects/base.py", line 226, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return fn(self, *args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/objects/instance.py", line 878, in save
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     old_ref, inst_ref = db.instance_update_and_get_original(
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/db/utils.py", line 35, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return f(*args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 144, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     with excutils.save_and_reraise_exception() as ectxt:
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     self.force_reraise()
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise self.value
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 142, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return f(*args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 207, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return f(context, *args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 2301, in instance_update_and_get_original
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     instance_ref = _instance_get_by_uuid(context, instance_uuid,
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 1411, in _instance_get_by_uuid
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise exception.InstanceNotFound(instance_id=uuid)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] nova.exception.InstanceNotFound: Instance b1120a6b-8463-4925-b4b0-3ebf3845041a could not be found.
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Traceback (most recent call last):
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py", line 682, in _get_domain
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return conn.lookupByUUIDString(instance.uuid)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/eventlet/tpool.py", line 186, in doit
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     result = proxy_call(self._autowrap, f, *args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/eventlet/tpool.py", line 144, in proxy_call
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     rv = execute(f, *args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]          ^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/eventlet/tpool.py", line 125, in execute
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise e.with_traceback(tb)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/eventlet/tpool.py", line 82, in tworker
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     rv = meth(*args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]          ^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib64/python3.12/site-packages/libvirt.py", line 5121, in lookupByUUIDString
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise libvirtError('virDomainLookupByUUIDString() failed')
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] libvirt.libvirtError: Domain not found: no domain with matching uuid 'b1120a6b-8463-4925-b4b0-3ebf3845041a'
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] During handling of the above exception, another exception occurred:
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Traceback (most recent call last):
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 10205, in post_live_migration_at_destination
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     with excutils.save_and_reraise_exception():
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     self.force_reraise()
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise self.value
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 10201, in post_live_migration_at_destination
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     self.driver.post_live_migration_at_destination(
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py", line 12076, in post_live_migration_at_destination
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     self._reattach_instance_vifs(context, instance, network_info)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py", line 11621, in _reattach_instance_vifs
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     guest = self._host.get_guest(instance)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py", line 666, in get_guest
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return libvirt_guest.Guest(self._get_domain(instance))
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]                                ^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py", line 686, in _get_domain
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise exception.InstanceNotFound(instance_id=instance.uuid)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] nova.exception.InstanceNotFound: Instance b1120a6b-8463-4925-b4b0-3ebf3845041a could not be found.
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] During handling of the above exception, another exception occurred:
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Traceback (most recent call last):
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/server.py", line 174, in _process_incoming
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     res = self.dispatcher.dispatch(message)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return self._do_dispatch(endpoint, method, ctxt, args)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     result = func(ctxt, **new_args)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]              ^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/exception_wrapper.py", line 65, in wrapped
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     with excutils.save_and_reraise_exception():
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     self.force_reraise()
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise self.value
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/exception_wrapper.py", line 63, in wrapped
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return f(self, context, *args, **kw)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/compute/utils.py", line 1483, in decorated_function
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return function(self, context, *args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 203, in decorated_function
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return function(self, context, *args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 10237, in post_live_migration_at_destination
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     instance.save(expected_task_state=task_states.MIGRATING)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_versionedobjects/base.py", line 209, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     updates, result = self.indirection_api.object_action(
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/conductor/rpcapi.py", line 247, in object_action
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return cctxt.call(context, 'object_action', objinst=objinst,
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/client.py", line 180, in call
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     result = self.transport._send(
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]              ^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_messaging/transport.py", line 123, in _send
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return self._driver.send(target, ctxt, message,
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 794, in send
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return self._send(target, ctxt, message, wait_for_reply, timeout,
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 786, in _send
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise result
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] nova.exception_Remote.InstanceNotFound_Remote: Instance b1120a6b-8463-4925-b4b0-3ebf3845041a could not be found.
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Traceback (most recent call last):
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/conductor/manager.py", line 143, in _object_dispatch
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return getattr(target, method)(*args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_versionedobjects/base.py", line 226, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return fn(self, *args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/objects/instance.py", line 878, in save
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     old_ref, inst_ref = db.instance_update_and_get_original(
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/db/utils.py", line 35, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return f(*args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 144, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     with excutils.save_and_reraise_exception() as ectxt:
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     self.force_reraise()
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise self.value
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 142, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return f(*args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 207, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return f(context, *args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 2301, in instance_update_and_get_original
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     instance_ref = _instance_get_by_uuid(context, instance_uuid,
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 1411, in _instance_get_by_uuid
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise exception.InstanceNotFound(instance_id=uuid)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] nova.exception.InstanceNotFound: Instance b1120a6b-8463-4925-b4b0-3ebf3845041a could not be found.
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] During handling of the above exception, another exception occurred:
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Traceback (most recent call last):
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py", line 11545, in _live_migration
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     self._live_migration_monitor(context, instance, guest, dest,
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py", line 11457, in _live_migration_monitor
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     post_method(context, instance, dest, block_migration,
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 9977, in _post_live_migration_update_host
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     instance.save()
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_versionedobjects/base.py", line 209, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     updates, result = self.indirection_api.object_action(
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/conductor/rpcapi.py", line 247, in object_action
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return cctxt.call(context, 'object_action', objinst=objinst,
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/client.py", line 180, in call
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     result = self.transport._send(
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]              ^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_messaging/transport.py", line 123, in _send
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return self._driver.send(target, ctxt, message,
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 794, in send
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return self._send(target, ctxt, message, wait_for_reply, timeout,
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 786, in _send
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise result
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] nova.exception_Remote.InstanceNotFound_Remote: Instance b1120a6b-8463-4925-b4b0-3ebf3845041a could not be found.
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Traceback (most recent call last):
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/conductor/manager.py", line 143, in _object_dispatch
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return getattr(target, method)(*args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_versionedobjects/base.py", line 226, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return fn(self, *args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/objects/instance.py", line 878, in save
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     old_ref, inst_ref = db.instance_update_and_get_original(
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/db/utils.py", line 35, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return f(*args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 144, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     with excutils.save_and_reraise_exception() as ectxt:
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     self.force_reraise()
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise self.value
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 142, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return f(*args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 207, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return f(context, *args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 2301, in instance_update_and_get_original
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     instance_ref = _instance_get_by_uuid(context, instance_uuid,
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 1411, in _instance_get_by_uuid
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise exception.InstanceNotFound(instance_id=uuid)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] nova.exception.InstanceNotFound: Instance b1120a6b-8463-4925-b4b0-3ebf3845041a could not be found.
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.347 186333 ERROR nova.virt.libvirt.driver [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 DEBUG nova.virt.libvirt.driver [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Live migration failed.: nova.exception.InstanceNotFound: Instance b1120a6b-8463-4925-b4b0-3ebf3845041a could not be found.
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Traceback (most recent call last):
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 9948, in _post_live_migration_update_host
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     self._post_live_migration(
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/exception_wrapper.py", line 65, in wrapped
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     with excutils.save_and_reraise_exception():
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     self.force_reraise()
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise self.value
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/exception_wrapper.py", line 63, in wrapped
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return f(self, context, *args, **kw)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 203, in decorated_function
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return function(self, context, *args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 10097, in _post_live_migration
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     self.compute_rpcapi.post_live_migration_at_destination(ctxt,
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/compute/rpcapi.py", line 932, in post_live_migration_at_destination
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return cctxt.call(ctxt, 'post_live_migration_at_destination',
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/client.py", line 180, in call
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     result = self.transport._send(
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]              ^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_messaging/transport.py", line 123, in _send
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return self._driver.send(target, ctxt, message,
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 794, in send
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return self._send(target, ctxt, message, wait_for_reply, timeout,
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 786, in _send
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise result
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] nova.exception_Remote.InstanceNotFound_Remote: Instance b1120a6b-8463-4925-b4b0-3ebf3845041a could not be found.
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Traceback (most recent call last):
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/conductor/manager.py", line 143, in _object_dispatch
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return getattr(target, method)(*args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_versionedobjects/base.py", line 226, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return fn(self, *args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/objects/instance.py", line 878, in save
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     old_ref, inst_ref = db.instance_update_and_get_original(
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/db/utils.py", line 35, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return f(*args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 144, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     with excutils.save_and_reraise_exception() as ectxt:
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     self.force_reraise()
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise self.value
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 142, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return f(*args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 207, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return f(context, *args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 2301, in instance_update_and_get_original
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     instance_ref = _instance_get_by_uuid(context, instance_uuid,
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 1411, in _instance_get_by_uuid
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise exception.InstanceNotFound(instance_id=uuid)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] nova.exception.InstanceNotFound: Instance b1120a6b-8463-4925-b4b0-3ebf3845041a could not be found.
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Traceback (most recent call last):
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py", line 682, in _get_domain
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return conn.lookupByUUIDString(instance.uuid)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/eventlet/tpool.py", line 186, in doit
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     result = proxy_call(self._autowrap, f, *args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/eventlet/tpool.py", line 144, in proxy_call
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     rv = execute(f, *args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]          ^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/eventlet/tpool.py", line 125, in execute
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise e.with_traceback(tb)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/eventlet/tpool.py", line 82, in tworker
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     rv = meth(*args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]          ^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib64/python3.12/site-packages/libvirt.py", line 5121, in lookupByUUIDString
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise libvirtError('virDomainLookupByUUIDString() failed')
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] libvirt.libvirtError: Domain not found: no domain with matching uuid 'b1120a6b-8463-4925-b4b0-3ebf3845041a'
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] During handling of the above exception, another exception occurred:
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Traceback (most recent call last):
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 10205, in post_live_migration_at_destination
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     with excutils.save_and_reraise_exception():
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     self.force_reraise()
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise self.value
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 10201, in post_live_migration_at_destination
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     self.driver.post_live_migration_at_destination(
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py", line 12076, in post_live_migration_at_destination
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     self._reattach_instance_vifs(context, instance, network_info)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py", line 11621, in _reattach_instance_vifs
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     guest = self._host.get_guest(instance)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py", line 666, in get_guest
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return libvirt_guest.Guest(self._get_domain(instance))
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]                                ^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py", line 686, in _get_domain
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise exception.InstanceNotFound(instance_id=instance.uuid)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] nova.exception.InstanceNotFound: Instance b1120a6b-8463-4925-b4b0-3ebf3845041a could not be found.
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] During handling of the above exception, another exception occurred:
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Traceback (most recent call last):
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/server.py", line 174, in _process_incoming
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     res = self.dispatcher.dispatch(message)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return self._do_dispatch(endpoint, method, ctxt, args)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     result = func(ctxt, **new_args)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]              ^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/exception_wrapper.py", line 65, in wrapped
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     with excutils.save_and_reraise_exception():
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     self.force_reraise()
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise self.value
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/exception_wrapper.py", line 63, in wrapped
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return f(self, context, *args, **kw)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/compute/utils.py", line 1483, in decorated_function
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return function(self, context, *args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 203, in decorated_function
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return function(self, context, *args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 10237, in post_live_migration_at_destination
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     instance.save(expected_task_state=task_states.MIGRATING)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_versionedobjects/base.py", line 209, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     updates, result = self.indirection_api.object_action(
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/conductor/rpcapi.py", line 247, in object_action
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return cctxt.call(context, 'object_action', objinst=objinst,
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/client.py", line 180, in call
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     result = self.transport._send(
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]              ^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_messaging/transport.py", line 123, in _send
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return self._driver.send(target, ctxt, message,
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 794, in send
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return self._send(target, ctxt, message, wait_for_reply, timeout,
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 786, in _send
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise result
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] nova.exception_Remote.InstanceNotFound_Remote: Instance b1120a6b-8463-4925-b4b0-3ebf3845041a could not be found.
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Traceback (most recent call last):
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/conductor/manager.py", line 143, in _object_dispatch
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return getattr(target, method)(*args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_versionedobjects/base.py", line 226, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return fn(self, *args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/objects/instance.py", line 878, in save
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     old_ref, inst_ref = db.instance_update_and_get_original(
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/db/utils.py", line 35, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return f(*args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 144, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     with excutils.save_and_reraise_exception() as ectxt:
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     self.force_reraise()
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise self.value
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 142, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return f(*args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 207, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return f(context, *args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 2301, in instance_update_and_get_original
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     instance_ref = _instance_get_by_uuid(context, instance_uuid,
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 1411, in _instance_get_by_uuid
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise exception.InstanceNotFound(instance_id=uuid)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] nova.exception.InstanceNotFound: Instance b1120a6b-8463-4925-b4b0-3ebf3845041a could not be found.
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] During handling of the above exception, another exception occurred:
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Traceback (most recent call last):
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py", line 11545, in _live_migration
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     self._live_migration_monitor(context, instance, guest, dest,
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py", line 11457, in _live_migration_monitor
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     post_method(context, instance, dest, block_migration,
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 9977, in _post_live_migration_update_host
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     instance.save()
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_versionedobjects/base.py", line 209, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     updates, result = self.indirection_api.object_action(
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/conductor/rpcapi.py", line 247, in object_action
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return cctxt.call(context, 'object_action', objinst=objinst,
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/client.py", line 180, in call
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     result = self.transport._send(
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]              ^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_messaging/transport.py", line 123, in _send
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return self._driver.send(target, ctxt, message,
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 794, in send
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return self._send(target, ctxt, message, wait_for_reply, timeout,
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 786, in _send
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise result
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] nova.exception_Remote.InstanceNotFound_Remote: Instance b1120a6b-8463-4925-b4b0-3ebf3845041a could not be found.
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Traceback (most recent call last):
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/conductor/manager.py", line 143, in _object_dispatch
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return getattr(target, method)(*args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_versionedobjects/base.py", line 226, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return fn(self, *args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/objects/instance.py", line 878, in save
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     old_ref, inst_ref = db.instance_update_and_get_original(
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/db/utils.py", line 35, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return f(*args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 144, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     with excutils.save_and_reraise_exception() as ectxt:
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     self.force_reraise()
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise self.value
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 142, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return f(*args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 207, in wrapper
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return f(context, *args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 2301, in instance_update_and_get_original
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     instance_ref = _instance_get_by_uuid(context, instance_uuid,
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 1411, in _instance_get_by_uuid
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise exception.InstanceNotFound(instance_id=uuid)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] nova.exception.InstanceNotFound: Instance b1120a6b-8463-4925-b4b0-3ebf3845041a could not be found.
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] During handling of the above exception, another exception occurred:
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Traceback (most recent call last):
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py", line 682, in _get_domain
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return conn.lookupByUUIDString(instance.uuid)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/eventlet/tpool.py", line 186, in doit
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     result = proxy_call(self._autowrap, f, *args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/eventlet/tpool.py", line 144, in proxy_call
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     rv = execute(f, *args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]          ^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/eventlet/tpool.py", line 125, in execute
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise e.with_traceback(tb)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/eventlet/tpool.py", line 82, in tworker
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     rv = meth(*args, **kwargs)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]          ^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib64/python3.12/site-packages/libvirt.py", line 5121, in lookupByUUIDString
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise libvirtError('virDomainLookupByUUIDString() failed')
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] libvirt.libvirtError: Domain not found: no domain with matching uuid 'b1120a6b-8463-4925-b4b0-3ebf3845041a'
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] During handling of the above exception, another exception occurred:
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] Traceback (most recent call last):
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 9665, in _do_live_migration
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     self.driver.live_migration(context, instance, dest,
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py", line 11005, in live_migration
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     self._live_migration(context, instance, dest,
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py", line 11560, in _live_migration
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     self.live_migration_abort(instance)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py", line 11016, in live_migration_abort
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     guest = self._host.get_guest(instance)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py", line 666, in get_guest
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     return libvirt_guest.Guest(self._get_domain(instance))
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]                                ^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]   File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py", line 686, in _get_domain
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a]     raise exception.InstanceNotFound(instance_id=instance.uuid)
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] nova.exception.InstanceNotFound: Instance b1120a6b-8463-4925-b4b0-3ebf3845041a could not be found.
Dec 05 06:16:07 compute-0 nova_compute[186329]: 2025-12-05 06:16:07.353 186333 ERROR nova.compute.manager [instance: b1120a6b-8463-4925-b4b0-3ebf3845041a] 
Dec 05 06:16:08 compute-0 nova_compute[186329]: 2025-12-05 06:16:08.367 186333 ERROR root [None req-6c1032ff-96c9-46ab-9dff-f1d705446089 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Original exception being dropped: ['Traceback (most recent call last):\n', '  File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 9948, in _post_live_migration_update_host\n    self._post_live_migration(\n', '  File "/usr/lib/python3.12/site-packages/nova/exception_wrapper.py", line 65, in wrapped\n    with excutils.save_and_reraise_exception():\n         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n', '  File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 227, in __exit__\n    self.force_reraise()\n', '  File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 200, in force_reraise\n    raise self.value\n', '  File "/usr/lib/python3.12/site-packages/nova/exception_wrapper.py", line 63, in wrapped\n    return f(self, context, *args, **kw)\n           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n', '  File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 203, in decorated_function\n    return function(self, context, *args, **kwargs)\n           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n', '  File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 10097, in _post_live_migration\n    self.compute_rpcapi.post_live_migration_at_destination(ctxt,\n', '  File "/usr/lib/python3.12/site-packages/nova/compute/rpcapi.py", line 932, in post_live_migration_at_destination\n    return cctxt.call(ctxt, \'post_live_migration_at_destination\',\n           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n', '  File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/client.py", line 180, in call\n    result = self.transport._send(\n             ^^^^^^^^^^^^^^^^^^^^^\n', '  File "/usr/lib/python3.12/site-packages/oslo_messaging/transport.py", line 123, in _send\n    return 
self._driver.send(target, ctxt, message,\n           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n', '  File "/usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 794, in send\n    return self._send(target, ctxt, message, wait_for_reply, timeout,\n           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n', '  File "/usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 786, in _send\n    raise result\n', 'nova.exception_Remote.InstanceNotFound_Remote: Instance b1120a6b-8463-4925-b4b0-3ebf3845041a could not be found.\nTraceback (most recent call last):\n\n  File "/usr/lib/python3.12/site-packages/nova/conductor/manager.py", line 143, in _object_dispatch\n    return getattr(target, method)(*args, **kwargs)\n           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n  File "/usr/lib/python3.12/site-packages/oslo_versionedobjects/base.py", line 226, in wrapper\n    return fn(self, *args, **kwargs)\n           ^^^^^^^^^^^^^^^^^^^^^^^^^\n\n  File "/usr/lib/python3.12/site-packages/nova/objects/instance.py", line 878, in save\n    old_ref, inst_ref = db.instance_update_and_get_original(\n                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n  File "/usr/lib/python3.12/site-packages/nova/db/utils.py", line 35, in wrapper\n    return f(*args, **kwargs)\n           ^^^^^^^^^^^^^^^^^^\n\n  File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 144, in wrapper\n    with excutils.save_and_reraise_exception() as ectxt:\n         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n  File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 227, in __exit__\n    self.force_reraise()\n\n  File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 200, in force_reraise\n    raise self.value\n\n  File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 142, in wrapper\n    return f(*args, **kwargs)\n           ^^^^^^^^^^^^^^^^^^\n\n  File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", 
line 207, in wrapper\n    return f(context, *args, **kwargs)\n           ^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n  File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 2301, in instance_update_and_get_original\n    instance_ref = _instance_get_by_uuid(context, instance_uuid,\n                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n  File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 1411, in _instance_get_by_uuid\n    raise exception.InstanceNotFound(instance_id=uuid)\n\nnova.exception.InstanceNotFound: Instance b1120a6b-8463-4925-b4b0-3ebf3845041a could not be found.\n\nTraceback (most recent call last):\n\n  File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py", line 682, in _get_domain\n    return conn.lookupByUUIDString(instance.uuid)\n           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n  File "/usr/lib/python3.12/site-packages/eventlet/tpool.py", line 186, in doit\n    result = proxy_call(self._autowrap, f, *args, **kwargs)\n             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n  File "/usr/lib/python3.12/site-packages/eventlet/tpool.py", line 144, in proxy_call\n    rv = execute(f, *args, **kwargs)\n         ^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n  File "/usr/lib/python3.12/site-packages/eventlet/tpool.py", line 125, in execute\n    raise e.with_traceback(tb)\n\n  File "/usr/lib/python3.12/site-packages/eventlet/tpool.py", line 82, in tworker\n    rv = meth(*args, **kwargs)\n         ^^^^^^^^^^^^^^^^^^^^^\n\n  File "/usr/lib64/python3.12/site-packages/libvirt.py", line 5121, in lookupByUUIDString\n    raise libvirtError(\'virDomainLookupByUUIDString() failed\')\n\nlibvirt.libvirtError: Domain not found: no domain with matching uuid \'b1120a6b-8463-4925-b4b0-3ebf3845041a\'\n\n\nDuring handling of the above exception, another exception occurred:\n\n\nTraceback (most recent call last):\n\n  File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 10205, in post_live_migration_at_destination\n    with 
excutils.save_and_reraise_exception():\n         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n  File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 227, in __exit__\n    self.force_reraise()\n\n  File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 200, in force_reraise\n    raise self.value\n\n  File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 10201, in post_live_migration_at_destination\n    self.driver.post_live_migration_at_destination(\n\n  File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py", line 12076, in post_live_migration_at_destination\n    self._reattach_instance_vifs(context, instance, network_info)\n\n  File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py", line 11621, in _reattach_instance_vifs\n    guest = self._host.get_guest(instance)\n            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n  File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py", line 666, in get_guest\n    return libvirt_guest.Guest(self._get_domain(instance))\n                               ^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n  File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py", line 686, in _get_domain\n    raise exception.InstanceNotFound(instance_id=instance.uuid)\n\nnova.exception.InstanceNotFound: Instance b1120a6b-8463-4925-b4b0-3ebf3845041a could not be found.\n\n\nDuring handling of the above exception, another exception occurred:\n\n\nTraceback (most recent call last):\n\n  File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/server.py", line 174, in _process_incoming\n    res = self.dispatcher.dispatch(message)\n          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n  File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch\n    return self._do_dispatch(endpoint, method, ctxt, args)\n           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n  File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in 
_do_dispatch\n    result = func(ctxt, **new_args)\n             ^^^^^^^^^^^^^^^^^^^^^^\n\n  File "/usr/lib/python3.12/site-packages/nova/exception_wrapper.py", line 65, in wrapped\n    with excutils.save_and_reraise_exception():\n         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:16:08 compute-0 nova_compute[186329]: ^^^^^^^^^\n\n  File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 227, in __exit__\n    self.force_reraise()\n\n  File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 200, in force_reraise\n    raise self.value\n\n  File "/usr/lib/python3.12/site-packages/nova/exception_wrapper.py", line 63, in wrapped\n    return f(self, context, *args, **kw)\n           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n  File "/usr/lib/python3.12/site-packages/nova/compute/utils.py", line 1483, in decorated_function\n    return function(self, context, *args, **kwargs)\n           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n  File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 203, in decorated_function\n    return function(self, context, *args, **kwargs)\n           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n  File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 10237, in post_live_migration_at_destination\n    instance.save(expected_task_state=task_states.MIGRATING)\n\n  File "/usr/lib/python3.12/site-packages/oslo_versionedobjects/base.py", line 209, in wrapper\n    updates, result = self.indirection_api.object_action(\n                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n  File "/usr/lib/python3.12/site-packages/nova/conductor/rpcapi.py", line 247, in object_action\n    return cctxt.call(context, \'object_action\', objinst=objinst,\n           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n  File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/client.py", line 180, in call\n    result = self.transport._send(\n             ^^^^^^^^^^^^^^^^^^^^^\n\n  File "/usr/lib/python3.12/site-packages/oslo_messaging/transport.py", line 123, in _send\n    return self._driver.send(target, ctxt, message,\n           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n  File "/usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 794, in send\n    
return self._send(target, ctxt, message, wait_for_reply, timeout,\n           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n  File "/usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 786, in _send\n    raise result\n\nnova.exception_Remote.InstanceNotFound_Remote: Instance b1120a6b-8463-4925-b4b0-3ebf3845041a could not be found.\nTraceback (most recent call last):\n\n  File "/usr/lib/python3.12/site-packages/nova/conductor/manager.py", line 143, in _object_dispatch\n    return getattr(target, method)(*args, **kwargs)\n           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n  File "/usr/lib/python3.12/site-packages/oslo_versionedobjects/base.py", line 226, in wrapper\n    return fn(self, *args, **kwargs)\n           ^^^^^^^^^^^^^^^^^^^^^^^^^\n\n  File "/usr/lib/python3.12/site-packages/nova/objects/instance.py", line 878, in save\n    old_ref, inst_ref = db.instance_update_and_get_original(\n                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n  File "/usr/lib/python3.12/site-packages/nova/db/utils.py", line 35, in wrapper\n    return f(*args, **kwargs)\n           ^^^^^^^^^^^^^^^^^^\n\n  File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 144, in wrapper\n    with excutils.save_and_reraise_exception() as ectxt:\n         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n  File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 227, in __exit__\n    self.force_reraise()\n\n  File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 200, in force_reraise\n    raise self.value\n\n  File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 142, in wrapper\n    return f(*args, **kwargs)\n           ^^^^^^^^^^^^^^^^^^\n\n  File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 207, in wrapper\n    return f(context, *args, **kwargs)\n           ^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n  File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 2301, in 
instance_update_and_get_original\n    instance_ref = _instance_get_by_uuid(context, instance_uuid,\n                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n  File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 1411, in _instance_get_by_uuid\n    raise exception.InstanceNotFound(instance_id=uuid)\n\nnova.exception.InstanceNotFound: Instance b1120a6b-8463-4925-b4b0-3ebf3845041a could not be found.\n\n\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', '  File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py", line 11545, in _live_migration\n    self._live_migration_monitor(context, instance, guest, dest,\n', '  File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py", line 11457, in _live_migration_monitor\n    post_method(context, instance, dest, block_migration,\n', '  File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 9977, in _post_live_migration_update_host\n    instance.save()\n', '  File "/usr/lib/python3.12/site-packages/oslo_versionedobjects/base.py", line 209, in wrapper\n    updates, result = self.indirection_api.object_action(\n                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n', '  File "/usr/lib/python3.12/site-packages/nova/conductor/rpcapi.py", line 247, in object_action\n    return cctxt.call(context, \'object_action\', objinst=objinst,\n           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n', '  File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/client.py", line 180, in call\n    result = self.transport._send(\n             ^^^^^^^^^^^^^^^^^^^^^\n', '  File "/usr/lib/python3.12/site-packages/oslo_messaging/transport.py", line 123, in _send\n    return self._driver.send(target, ctxt, message,\n           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n', '  File "/usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 794, in send\n    return self._send(target, 
ctxt, message, wait_for_reply, timeout,\n           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n', '  File "/usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 786, in _send\n    raise result\n', 'nova.exception_Remote.InstanceNotFound_Remote: Instance b1120a6b-8463-4925-b4b0-3ebf3845041a could not be found.\nTraceback (most recent call last):\n\n  File "/usr/lib/python3.12/site-packages/nova/conductor/manager.py", line 143, in _object_dispatch\n    return getattr(target, method)(*args, **kwargs)\n           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n  File "/usr/lib/python3.12/site-packages/oslo_versionedobjects/base.py", line 226, in wrapper\n    return fn(self, *args, **kwargs)\n           ^^^^^^^^^^^^^^^^^^^^^^^^^\n\n  File "/usr/lib/python3.12/site-packages/nova/objects/instance.py", line 878, in save\n    old_ref, inst_ref = db.instance_update_and_get_original(\n                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n  File "/usr/lib/python3.12/site-packages/nova/db/utils.py", line 35, in wrapper\n    return f(*args, **kwargs)\n           ^^^^^^^^^^^^^^^^^^\n\n  File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 144, in wrapper\n    with excutils.save_and_reraise_exception() as ectxt:\n         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n  File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 227, in __exit__\n    self.force_reraise()\n\n  File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 200, in force_reraise\n    raise self.value\n\n  File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 142, in wrapper\n    return f(*args, **kwargs)\n           ^^^^^^^^^^^^^^^^^^\n\n  File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 207, in wrapper\n    return f(context, *args, **kwargs)\n           ^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n  File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 2301, in instance_update_and_get_original\n    
instance_ref = _instance_get_by_uuid(context, instance_uuid,\n                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n  File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 1411, in _instance_get_by_uuid\n    raise exception.InstanceNotFound(instance_id=uuid)
Dec 05 06:16:08 compute-0 nova_compute[186329]: \n\nnova.exception.InstanceNotFound: Instance b1120a6b-8463-4925-b4b0-3ebf3845041a could not be found.\n\n', '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', '  File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py", line 682, in _get_domain\n    return conn.lookupByUUIDString(instance.uuid)\n           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n', '  File "/usr/lib/python3.12/site-packages/eventlet/tpool.py", line 186, in doit\n    result = proxy_call(self._autowrap, f, *args, **kwargs)\n             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n', '  File "/usr/lib/python3.12/site-packages/eventlet/tpool.py", line 144, in proxy_call\n    rv = execute(f, *args, **kwargs)\n         ^^^^^^^^^^^^^^^^^^^^^^^^^^^\n', '  File "/usr/lib/python3.12/site-packages/eventlet/tpool.py", line 125, in execute\n    raise e.with_traceback(tb)\n', '  File "/usr/lib/python3.12/site-packages/eventlet/tpool.py", line 82, in tworker\n    rv = meth(*args, **kwargs)\n         ^^^^^^^^^^^^^^^^^^^^^\n', '  File "/usr/lib64/python3.12/site-packages/libvirt.py", line 5121, in lookupByUUIDString\n    raise libvirtError(\'virDomainLookupByUUIDString() failed\')\n', "libvirt.libvirtError: Domain not found: no domain with matching uuid 'b1120a6b-8463-4925-b4b0-3ebf3845041a'\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', '  File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 9665, in _do_live_migration\n    self.driver.live_migration(context, instance, dest,\n', '  File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py", line 11005, in live_migration\n    self._live_migration(context, instance, dest,\n', '  File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py", line 11560, in _live_migration\n    self.live_migration_abort(instance)\n', '  File 
"/usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py", line 11016, in live_migration_abort\n    guest = self._host.get_guest(instance)\n            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n', '  File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py", line 666, in get_guest\n    return libvirt_guest.Guest(self._get_domain(instance))\n                               ^^^^^^^^^^^^^^^^^^^^^^^^^^\n', '  File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py", line 686, in _get_domain\n    raise exception.InstanceNotFound(instance_id=instance.uuid)\n', 'nova.exception.InstanceNotFound: Instance b1120a6b-8463-4925-b4b0-3ebf3845041a could not be found.\n']: nova.exception_Remote.InstanceNotFound_Remote: Instance b1120a6b-8463-4925-b4b0-3ebf3845041a could not be found.
Dec 05 06:16:08 compute-0 rsyslogd[961]: message too long (8192) with configured size 8096, begin of message is: 2025-12-05 06:16:08.367 186333 ERROR root [None req-6c1032ff-96c9-46ab-9dff-f1d7 [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 06:16:08 compute-0 rsyslogd[961]: message too long (8192) with configured size 8096, begin of message is: ^^^^^^^^^\n\n  File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py",  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Dec 05 06:16:08 compute-0 nova_compute[186329]: 2025-12-05 06:16:08.646 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:16:09 compute-0 nova_compute[186329]: 2025-12-05 06:16:09.962 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:16:13 compute-0 nova_compute[186329]: 2025-12-05 06:16:13.646 186333 DEBUG nova.compute.manager [None req-7f675f3a-76d1-4def-9a80-6cf8f2fd7c88 79e343ce21c64b399f0b65fe3f3acc80 fcef582be2274b9ba43451b49b4066ec - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider f2df025e-56e9-4920-9fad-1a12202c4aeb in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:632
Dec 05 06:16:13 compute-0 nova_compute[186329]: 2025-12-05 06:16:13.649 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:16:13 compute-0 nova_compute[186329]: 2025-12-05 06:16:13.685 186333 DEBUG nova.compute.provider_tree [None req-7f675f3a-76d1-4def-9a80-6cf8f2fd7c88 79e343ce21c64b399f0b65fe3f3acc80 fcef582be2274b9ba43451b49b4066ec - - default default] Updating resource provider f2df025e-56e9-4920-9fad-1a12202c4aeb generation from 8 to 10 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Dec 05 06:16:14 compute-0 podman[209010]: 2025-12-05 06:16:14.455543205 +0000 UTC m=+0.038266122 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 05 06:16:14 compute-0 podman[209009]: 2025-12-05 06:16:14.475741743 +0000 UTC m=+0.059679676 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Dec 05 06:16:14 compute-0 nova_compute[186329]: 2025-12-05 06:16:14.964 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:16:15 compute-0 nova_compute[186329]: 2025-12-05 06:16:15.859 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:16:18 compute-0 nova_compute[186329]: 2025-12-05 06:16:18.651 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:16:19 compute-0 nova_compute[186329]: 2025-12-05 06:16:19.965 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:16:21 compute-0 podman[209053]: 2025-12-05 06:16:21.457351509 +0000 UTC m=+0.041659602 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, 
org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Dec 05 06:16:21 compute-0 podman[209054]: 2025-12-05 06:16:21.462345228 +0000 UTC m=+0.045695991 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, config_id=edpm, version=9.6, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Dec 05 06:16:21 compute-0 podman[209055]: 2025-12-05 06:16:21.463901073 +0000 UTC m=+0.045482299 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 05 06:16:23 compute-0 nova_compute[186329]: 2025-12-05 06:16:23.652 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:16:23 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:16:23.767 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6d:08:4f 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-a081d805-6a60-4ae4-bab6-0c64058cea31', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a081d805-6a60-4ae4-bab6-0c64058cea31', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f05c5d52146b475d8173160ea93e10c1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=29a0cd84-860b-46e5-b930-31eb2ef36ecf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=6b40e325-0552-4345-8544-736b07859dde) old=Port_Binding(mac=['fa:16:3e:6d:08:4f'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-a081d805-6a60-4ae4-bab6-0c64058cea31', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a081d805-6a60-4ae4-bab6-0c64058cea31', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f05c5d52146b475d8173160ea93e10c1', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:16:23 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:16:23.768 104041 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 6b40e325-0552-4345-8544-736b07859dde in datapath a081d805-6a60-4ae4-bab6-0c64058cea31 updated
Dec 05 06:16:23 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:16:23.768 104041 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a081d805-6a60-4ae4-bab6-0c64058cea31, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 05 06:16:23 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:16:23.769 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[5467b654-dfc9-4826-bb8c-78678dd2340c]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:16:24 compute-0 nova_compute[186329]: 2025-12-05 06:16:24.967 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:16:28 compute-0 nova_compute[186329]: 2025-12-05 06:16:28.652 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:16:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:16:29.157 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:62:b2:dc 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-f6144a90-8032-4b70-9199-37ed796a58da', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f6144a90-8032-4b70-9199-37ed796a58da', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd1cf83c63cb24c9893c185a8a835526b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=956272b8-c2a4-457b-b980-5a53e4d07b3a, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=99fc309b-546e-4af4-8d56-00cbb2e20c95) old=Port_Binding(mac=['fa:16:3e:62:b2:dc'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-f6144a90-8032-4b70-9199-37ed796a58da', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f6144a90-8032-4b70-9199-37ed796a58da', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd1cf83c63cb24c9893c185a8a835526b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:16:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:16:29.158 104041 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 99fc309b-546e-4af4-8d56-00cbb2e20c95 in datapath f6144a90-8032-4b70-9199-37ed796a58da updated
Dec 05 06:16:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:16:29.158 104041 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f6144a90-8032-4b70-9199-37ed796a58da, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 05 06:16:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:16:29.159 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[161db71b-d078-4802-89a6-a548347d946d]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:16:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:16:29.499 104041 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:16:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:16:29.499 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:16:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:16:29.499 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:16:29 compute-0 podman[196599]: time="2025-12-05T06:16:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:16:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:16:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:16:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:16:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2580 "" "Go-http-client/1.1"
Dec 05 06:16:29 compute-0 nova_compute[186329]: 2025-12-05 06:16:29.968 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:16:31 compute-0 openstack_network_exporter[198686]: ERROR   06:16:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:16:31 compute-0 openstack_network_exporter[198686]: ERROR   06:16:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:16:31 compute-0 openstack_network_exporter[198686]: ERROR   06:16:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:16:31 compute-0 openstack_network_exporter[198686]: ERROR   06:16:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:16:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:16:31 compute-0 openstack_network_exporter[198686]: ERROR   06:16:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:16:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:16:33 compute-0 nova_compute[186329]: 2025-12-05 06:16:33.654 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:16:34 compute-0 nova_compute[186329]: 2025-12-05 06:16:34.970 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:16:38 compute-0 nova_compute[186329]: 2025-12-05 06:16:38.657 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:16:39 compute-0 nova_compute[186329]: 2025-12-05 06:16:39.972 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:16:41 compute-0 nova_compute[186329]: 2025-12-05 06:16:41.705 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:16:43 compute-0 nova_compute[186329]: 2025-12-05 06:16:43.658 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:16:44 compute-0 nova_compute[186329]: 2025-12-05 06:16:44.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:16:44 compute-0 nova_compute[186329]: 2025-12-05 06:16:44.709 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 05 06:16:44 compute-0 nova_compute[186329]: 2025-12-05 06:16:44.974 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:16:45 compute-0 podman[209106]: 2025-12-05 06:16:45.468367916 +0000 UTC m=+0.045054014 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 06:16:45 compute-0 podman[209105]: 2025-12-05 06:16:45.487506742 +0000 UTC m=+0.065025607 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 05 06:16:45 compute-0 nova_compute[186329]: 2025-12-05 06:16:45.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:16:45 compute-0 nova_compute[186329]: 2025-12-05 06:16:45.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:16:45 compute-0 nova_compute[186329]: 2025-12-05 06:16:45.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:16:46 compute-0 nova_compute[186329]: 2025-12-05 06:16:46.220 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:16:46 compute-0 nova_compute[186329]: 2025-12-05 06:16:46.221 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:16:46 compute-0 nova_compute[186329]: 2025-12-05 06:16:46.221 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:16:46 compute-0 nova_compute[186329]: 2025-12-05 06:16:46.221 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 05 06:16:46 compute-0 nova_compute[186329]: 2025-12-05 06:16:46.387 186333 WARNING nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:16:46 compute-0 nova_compute[186329]: 2025-12-05 06:16:46.388 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:16:46 compute-0 nova_compute[186329]: 2025-12-05 06:16:46.402 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.014s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:16:46 compute-0 nova_compute[186329]: 2025-12-05 06:16:46.403 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5862MB free_disk=73.16918182373047GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": 
"0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 05 06:16:46 compute-0 nova_compute[186329]: 2025-12-05 06:16:46.403 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:16:46 compute-0 nova_compute[186329]: 2025-12-05 06:16:46.403 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:16:47 compute-0 ovn_controller[95223]: 2025-12-05T06:16:47Z|00075|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Dec 05 06:16:47 compute-0 nova_compute[186329]: 2025-12-05 06:16:47.940 186333 INFO nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance 83dcc9e9-9fcf-4e34-8b76-598c38ed9bc0 has allocations against this compute host but is not found in the database.
Dec 05 06:16:47 compute-0 nova_compute[186329]: 2025-12-05 06:16:47.940 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 05 06:16:47 compute-0 nova_compute[186329]: 2025-12-05 06:16:47.941 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 06:16:46 up 54 min,  0 user,  load average: 0.16, 0.25, 0.34\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 05 06:16:47 compute-0 nova_compute[186329]: 2025-12-05 06:16:47.967 186333 DEBUG nova.compute.provider_tree [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:16:48 compute-0 nova_compute[186329]: 2025-12-05 06:16:48.471 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:16:48 compute-0 nova_compute[186329]: 2025-12-05 06:16:48.660 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:16:48 compute-0 nova_compute[186329]: 2025-12-05 06:16:48.979 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 05 06:16:48 compute-0 nova_compute[186329]: 2025-12-05 06:16:48.979 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.576s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:16:49 compute-0 nova_compute[186329]: 2025-12-05 06:16:49.975 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:16:50 compute-0 nova_compute[186329]: 2025-12-05 06:16:50.975 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:16:50 compute-0 nova_compute[186329]: 2025-12-05 06:16:50.976 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:16:50 compute-0 nova_compute[186329]: 2025-12-05 06:16:50.976 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:16:52 compute-0 podman[209151]: 2025-12-05 06:16:52.462320147 +0000 UTC m=+0.046792643 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec)
Dec 05 06:16:52 compute-0 podman[209152]: 2025-12-05 06:16:52.467813737 +0000 UTC m=+0.049686565 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, config_id=edpm, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, distribution-scope=public, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vendor=Red Hat, Inc., version=9.6, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 05 06:16:52 compute-0 podman[209153]: 2025-12-05 06:16:52.482385746 +0000 UTC m=+0.062144619 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Dec 05 06:16:52 compute-0 nova_compute[186329]: 2025-12-05 06:16:52.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:16:53 compute-0 nova_compute[186329]: 2025-12-05 06:16:53.662 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:16:54 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:16:54.679 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a7:f8:9d 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '36f184a410c54894823168ed0f00b1ce', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1542876b-3afe-4c08-a982-954e6ce063b6, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=11b2e7a6-c4ec-4f31-8535-807d9ce71179) old=Port_Binding(mac=['fa:16:3e:a7:f8:9d'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '36f184a410c54894823168ed0f00b1ce', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:16:54 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:16:54.679 104041 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 11b2e7a6-c4ec-4f31-8535-807d9ce71179 in datapath df5a1a6f-32f3-42ca-8c18-60ea4ce9d923 updated
Dec 05 06:16:54 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:16:54.680 104041 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network df5a1a6f-32f3-42ca-8c18-60ea4ce9d923, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 05 06:16:54 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:16:54.681 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[be2daa75-b9c8-448f-81a9-2429097094be]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:16:54 compute-0 nova_compute[186329]: 2025-12-05 06:16:54.977 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:16:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:16:55.162 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '46:84:d1', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'a6:99:88:db:d9:a2'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:16:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:16:55.162 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 05 06:16:55 compute-0 nova_compute[186329]: 2025-12-05 06:16:55.164 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:16:58 compute-0 nova_compute[186329]: 2025-12-05 06:16:58.663 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:16:59 compute-0 podman[196599]: time="2025-12-05T06:16:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:16:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:16:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:16:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:16:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2581 "" "Go-http-client/1.1"
Dec 05 06:16:59 compute-0 nova_compute[186329]: 2025-12-05 06:16:59.979 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:17:01 compute-0 openstack_network_exporter[198686]: ERROR   06:17:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:17:01 compute-0 openstack_network_exporter[198686]: ERROR   06:17:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:17:01 compute-0 openstack_network_exporter[198686]: ERROR   06:17:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:17:01 compute-0 openstack_network_exporter[198686]: ERROR   06:17:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:17:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:17:01 compute-0 openstack_network_exporter[198686]: ERROR   06:17:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:17:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:17:03 compute-0 nova_compute[186329]: 2025-12-05 06:17:03.666 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:17:04 compute-0 nova_compute[186329]: 2025-12-05 06:17:04.981 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:17:05 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:17:05.163 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=89d40815-76f5-4f1d-9077-84d831b7d6c4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:17:06 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:17:06.475 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:48:ba:a9 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-933694a8-8de7-4ed5-8258-3770be3b65f8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-933694a8-8de7-4ed5-8258-3770be3b65f8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '72c4ee5cc96a42b99210abaf8ae6fcc3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c6996580-1c16-419b-92c5-35f982a28b72, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=7e2a8c7b-cb15-4e35-95bb-40ffe505efb1) old=Port_Binding(mac=['fa:16:3e:48:ba:a9'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-933694a8-8de7-4ed5-8258-3770be3b65f8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-933694a8-8de7-4ed5-8258-3770be3b65f8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '72c4ee5cc96a42b99210abaf8ae6fcc3', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:17:06 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:17:06.476 104041 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 7e2a8c7b-cb15-4e35-95bb-40ffe505efb1 in datapath 933694a8-8de7-4ed5-8258-3770be3b65f8 updated
Dec 05 06:17:06 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:17:06.476 104041 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 933694a8-8de7-4ed5-8258-3770be3b65f8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 05 06:17:06 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:17:06.477 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[9365b4f3-faa6-4f23-93b2-bd95bb1c0028]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:17:08 compute-0 nova_compute[186329]: 2025-12-05 06:17:08.667 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:17:09 compute-0 nova_compute[186329]: 2025-12-05 06:17:09.983 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:17:13 compute-0 nova_compute[186329]: 2025-12-05 06:17:13.669 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:17:14 compute-0 nova_compute[186329]: 2025-12-05 06:17:14.986 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:17:16 compute-0 podman[209207]: 2025-12-05 06:17:16.45981721 +0000 UTC m=+0.037336591 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 05 06:17:16 compute-0 podman[209206]: 2025-12-05 06:17:16.481481502 +0000 UTC m=+0.066939638 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 05 06:17:18 compute-0 nova_compute[186329]: 2025-12-05 06:17:18.671 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:17:19 compute-0 nova_compute[186329]: 2025-12-05 06:17:19.988 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:17:23 compute-0 podman[209252]: 2025-12-05 06:17:23.464341919 +0000 UTC m=+0.046624152 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, build-date=2025-08-20T13:12:41, release=1755695350, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-type=git, com.redhat.component=ubi9-minimal-container, distribution-scope=public, name=ubi9-minimal, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.)
Dec 05 06:17:23 compute-0 podman[209253]: 2025-12-05 06:17:23.467042337 +0000 UTC m=+0.047922344 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec 05 06:17:23 compute-0 podman[209251]: 2025-12-05 06:17:23.484566264 +0000 UTC m=+0.070430973 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Dec 05 06:17:23 compute-0 nova_compute[186329]: 2025-12-05 06:17:23.672 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:17:24 compute-0 nova_compute[186329]: 2025-12-05 06:17:24.990 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:17:28 compute-0 nova_compute[186329]: 2025-12-05 06:17:28.673 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:17:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:17:29.500 104041 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:17:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:17:29.500 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:17:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:17:29.500 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:17:29 compute-0 podman[196599]: time="2025-12-05T06:17:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:17:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:17:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:17:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:17:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2584 "" "Go-http-client/1.1"
Dec 05 06:17:29 compute-0 nova_compute[186329]: 2025-12-05 06:17:29.991 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:17:31 compute-0 openstack_network_exporter[198686]: ERROR   06:17:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:17:31 compute-0 openstack_network_exporter[198686]: ERROR   06:17:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:17:31 compute-0 openstack_network_exporter[198686]: ERROR   06:17:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:17:31 compute-0 openstack_network_exporter[198686]: ERROR   06:17:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:17:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:17:31 compute-0 openstack_network_exporter[198686]: ERROR   06:17:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:17:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:17:33 compute-0 nova_compute[186329]: 2025-12-05 06:17:33.088 186333 DEBUG oslo_concurrency.lockutils [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Acquiring lock "34c83f7e-4b49-4a20-b482-394fe5b63c68" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:17:33 compute-0 nova_compute[186329]: 2025-12-05 06:17:33.088 186333 DEBUG oslo_concurrency.lockutils [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "34c83f7e-4b49-4a20-b482-394fe5b63c68" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:17:33 compute-0 nova_compute[186329]: 2025-12-05 06:17:33.591 186333 DEBUG nova.compute.manager [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Dec 05 06:17:33 compute-0 nova_compute[186329]: 2025-12-05 06:17:33.674 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:17:34 compute-0 nova_compute[186329]: 2025-12-05 06:17:34.121 186333 DEBUG oslo_concurrency.lockutils [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:17:34 compute-0 nova_compute[186329]: 2025-12-05 06:17:34.122 186333 DEBUG oslo_concurrency.lockutils [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:17:34 compute-0 nova_compute[186329]: 2025-12-05 06:17:34.126 186333 DEBUG nova.virt.hardware [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 05 06:17:34 compute-0 nova_compute[186329]: 2025-12-05 06:17:34.126 186333 INFO nova.compute.claims [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Claim successful on node compute-0.ctlplane.example.com
Dec 05 06:17:34 compute-0 nova_compute[186329]: 2025-12-05 06:17:34.993 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:17:35 compute-0 nova_compute[186329]: 2025-12-05 06:17:35.150 186333 DEBUG nova.scheduler.client.report [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Refreshing inventories for resource provider f2df025e-56e9-4920-9fad-1a12202c4aeb _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Dec 05 06:17:35 compute-0 nova_compute[186329]: 2025-12-05 06:17:35.162 186333 DEBUG nova.scheduler.client.report [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Updating ProviderTree inventory for provider f2df025e-56e9-4920-9fad-1a12202c4aeb from _refresh_and_get_inventory using data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Dec 05 06:17:35 compute-0 nova_compute[186329]: 2025-12-05 06:17:35.162 186333 DEBUG nova.compute.provider_tree [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Updating inventory in ProviderTree for provider f2df025e-56e9-4920-9fad-1a12202c4aeb with inventory: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 05 06:17:35 compute-0 nova_compute[186329]: 2025-12-05 06:17:35.171 186333 DEBUG nova.scheduler.client.report [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Refreshing aggregate associations for resource provider f2df025e-56e9-4920-9fad-1a12202c4aeb, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Dec 05 06:17:35 compute-0 nova_compute[186329]: 2025-12-05 06:17:35.185 186333 DEBUG nova.scheduler.client.report [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Refreshing trait associations for resource provider f2df025e-56e9-4920-9fad-1a12202c4aeb, traits: COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_EXTEND,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SOUND_MODEL_SB16,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE42,COMPUTE_NET_VIRTIO_PACKED,HW_CPU_X86_SSE2,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_TPM_CRB,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_TPM_TIS,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SOUND_MODEL_ES1370,HW_ARCH_X86_64,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SOUND_MODEL_USB,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_TRUSTED_CERTS,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_ARCH_X86_64,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_INTEL _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Dec 05 06:17:35 compute-0 nova_compute[186329]: 2025-12-05 06:17:35.218 186333 DEBUG nova.compute.provider_tree [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:17:35 compute-0 nova_compute[186329]: 2025-12-05 06:17:35.722 186333 DEBUG nova.scheduler.client.report [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:17:36 compute-0 nova_compute[186329]: 2025-12-05 06:17:36.229 186333 DEBUG oslo_concurrency.lockutils [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.107s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:17:36 compute-0 nova_compute[186329]: 2025-12-05 06:17:36.229 186333 DEBUG nova.compute.manager [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Dec 05 06:17:36 compute-0 nova_compute[186329]: 2025-12-05 06:17:36.736 186333 DEBUG nova.compute.manager [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Dec 05 06:17:36 compute-0 nova_compute[186329]: 2025-12-05 06:17:36.736 186333 DEBUG nova.network.neutron [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Dec 05 06:17:36 compute-0 nova_compute[186329]: 2025-12-05 06:17:36.737 186333 WARNING neutronclient.v2_0.client [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:17:36 compute-0 nova_compute[186329]: 2025-12-05 06:17:36.737 186333 WARNING neutronclient.v2_0.client [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:17:37 compute-0 nova_compute[186329]: 2025-12-05 06:17:37.241 186333 INFO nova.virt.libvirt.driver [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 06:17:37 compute-0 nova_compute[186329]: 2025-12-05 06:17:37.553 186333 DEBUG nova.network.neutron [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Successfully created port: c00622c0-b6d3-4225-8e33-44b59cb8a42a _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Dec 05 06:17:37 compute-0 nova_compute[186329]: 2025-12-05 06:17:37.747 186333 DEBUG nova.compute.manager [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Dec 05 06:17:37 compute-0 nova_compute[186329]: 2025-12-05 06:17:37.984 186333 DEBUG nova.network.neutron [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Successfully updated port: c00622c0-b6d3-4225-8e33-44b59cb8a42a _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Dec 05 06:17:38 compute-0 nova_compute[186329]: 2025-12-05 06:17:38.022 186333 DEBUG nova.compute.manager [req-2cc1a3ff-893f-4055-806c-44742a81cc33 req-edd31a83-ff46-4f5c-a14d-fcae78c74c57 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Received event network-changed-c00622c0-b6d3-4225-8e33-44b59cb8a42a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:17:38 compute-0 nova_compute[186329]: 2025-12-05 06:17:38.022 186333 DEBUG nova.compute.manager [req-2cc1a3ff-893f-4055-806c-44742a81cc33 req-edd31a83-ff46-4f5c-a14d-fcae78c74c57 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Refreshing instance network info cache due to event network-changed-c00622c0-b6d3-4225-8e33-44b59cb8a42a. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 05 06:17:38 compute-0 nova_compute[186329]: 2025-12-05 06:17:38.022 186333 DEBUG oslo_concurrency.lockutils [req-2cc1a3ff-893f-4055-806c-44742a81cc33 req-edd31a83-ff46-4f5c-a14d-fcae78c74c57 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "refresh_cache-34c83f7e-4b49-4a20-b482-394fe5b63c68" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:17:38 compute-0 nova_compute[186329]: 2025-12-05 06:17:38.022 186333 DEBUG oslo_concurrency.lockutils [req-2cc1a3ff-893f-4055-806c-44742a81cc33 req-edd31a83-ff46-4f5c-a14d-fcae78c74c57 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquired lock "refresh_cache-34c83f7e-4b49-4a20-b482-394fe5b63c68" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:17:38 compute-0 nova_compute[186329]: 2025-12-05 06:17:38.022 186333 DEBUG nova.network.neutron [req-2cc1a3ff-893f-4055-806c-44742a81cc33 req-edd31a83-ff46-4f5c-a14d-fcae78c74c57 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Refreshing network info cache for port c00622c0-b6d3-4225-8e33-44b59cb8a42a _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 05 06:17:38 compute-0 nova_compute[186329]: 2025-12-05 06:17:38.488 186333 DEBUG oslo_concurrency.lockutils [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Acquiring lock "refresh_cache-34c83f7e-4b49-4a20-b482-394fe5b63c68" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:17:38 compute-0 nova_compute[186329]: 2025-12-05 06:17:38.525 186333 WARNING neutronclient.v2_0.client [req-2cc1a3ff-893f-4055-806c-44742a81cc33 req-edd31a83-ff46-4f5c-a14d-fcae78c74c57 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:17:38 compute-0 nova_compute[186329]: 2025-12-05 06:17:38.675 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:17:38 compute-0 nova_compute[186329]: 2025-12-05 06:17:38.757 186333 DEBUG nova.compute.manager [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Dec 05 06:17:38 compute-0 nova_compute[186329]: 2025-12-05 06:17:38.758 186333 DEBUG nova.virt.libvirt.driver [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Dec 05 06:17:38 compute-0 nova_compute[186329]: 2025-12-05 06:17:38.758 186333 INFO nova.virt.libvirt.driver [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Creating image(s)
Dec 05 06:17:38 compute-0 nova_compute[186329]: 2025-12-05 06:17:38.759 186333 DEBUG oslo_concurrency.lockutils [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Acquiring lock "/var/lib/nova/instances/34c83f7e-4b49-4a20-b482-394fe5b63c68/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:17:38 compute-0 nova_compute[186329]: 2025-12-05 06:17:38.759 186333 DEBUG oslo_concurrency.lockutils [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "/var/lib/nova/instances/34c83f7e-4b49-4a20-b482-394fe5b63c68/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:17:38 compute-0 nova_compute[186329]: 2025-12-05 06:17:38.759 186333 DEBUG oslo_concurrency.lockutils [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "/var/lib/nova/instances/34c83f7e-4b49-4a20-b482-394fe5b63c68/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:17:38 compute-0 nova_compute[186329]: 2025-12-05 06:17:38.760 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:17:38 compute-0 nova_compute[186329]: 2025-12-05 06:17:38.762 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:17:38 compute-0 nova_compute[186329]: 2025-12-05 06:17:38.765 186333 DEBUG oslo_concurrency.processutils [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:17:38 compute-0 nova_compute[186329]: 2025-12-05 06:17:38.808 186333 DEBUG oslo_concurrency.processutils [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:17:38 compute-0 nova_compute[186329]: 2025-12-05 06:17:38.809 186333 DEBUG oslo_concurrency.lockutils [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Acquiring lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:17:38 compute-0 nova_compute[186329]: 2025-12-05 06:17:38.809 186333 DEBUG oslo_concurrency.lockutils [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:17:38 compute-0 nova_compute[186329]: 2025-12-05 06:17:38.810 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:17:38 compute-0 nova_compute[186329]: 2025-12-05 06:17:38.812 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:17:38 compute-0 nova_compute[186329]: 2025-12-05 06:17:38.812 186333 DEBUG oslo_concurrency.processutils [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:17:38 compute-0 nova_compute[186329]: 2025-12-05 06:17:38.854 186333 DEBUG oslo_concurrency.processutils [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:17:38 compute-0 nova_compute[186329]: 2025-12-05 06:17:38.855 186333 DEBUG oslo_concurrency.processutils [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b,backing_fmt=raw /var/lib/nova/instances/34c83f7e-4b49-4a20-b482-394fe5b63c68/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:17:38 compute-0 nova_compute[186329]: 2025-12-05 06:17:38.876 186333 DEBUG oslo_concurrency.processutils [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b,backing_fmt=raw /var/lib/nova/instances/34c83f7e-4b49-4a20-b482-394fe5b63c68/disk 1073741824" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:17:38 compute-0 nova_compute[186329]: 2025-12-05 06:17:38.876 186333 DEBUG oslo_concurrency.lockutils [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.067s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:17:38 compute-0 nova_compute[186329]: 2025-12-05 06:17:38.877 186333 DEBUG oslo_concurrency.processutils [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:17:38 compute-0 nova_compute[186329]: 2025-12-05 06:17:38.920 186333 DEBUG oslo_concurrency.processutils [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:17:38 compute-0 nova_compute[186329]: 2025-12-05 06:17:38.921 186333 DEBUG nova.virt.disk.api [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Checking if we can resize image /var/lib/nova/instances/34c83f7e-4b49-4a20-b482-394fe5b63c68/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 05 06:17:38 compute-0 nova_compute[186329]: 2025-12-05 06:17:38.921 186333 DEBUG oslo_concurrency.processutils [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/34c83f7e-4b49-4a20-b482-394fe5b63c68/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:17:38 compute-0 nova_compute[186329]: 2025-12-05 06:17:38.965 186333 DEBUG oslo_concurrency.processutils [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/34c83f7e-4b49-4a20-b482-394fe5b63c68/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:17:38 compute-0 nova_compute[186329]: 2025-12-05 06:17:38.965 186333 DEBUG nova.virt.disk.api [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Cannot resize image /var/lib/nova/instances/34c83f7e-4b49-4a20-b482-394fe5b63c68/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 05 06:17:38 compute-0 nova_compute[186329]: 2025-12-05 06:17:38.966 186333 DEBUG nova.virt.libvirt.driver [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Dec 05 06:17:38 compute-0 nova_compute[186329]: 2025-12-05 06:17:38.966 186333 DEBUG nova.virt.libvirt.driver [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Ensure instance console log exists: /var/lib/nova/instances/34c83f7e-4b49-4a20-b482-394fe5b63c68/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 05 06:17:38 compute-0 nova_compute[186329]: 2025-12-05 06:17:38.966 186333 DEBUG oslo_concurrency.lockutils [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:17:38 compute-0 nova_compute[186329]: 2025-12-05 06:17:38.967 186333 DEBUG oslo_concurrency.lockutils [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:17:38 compute-0 nova_compute[186329]: 2025-12-05 06:17:38.967 186333 DEBUG oslo_concurrency.lockutils [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:17:39 compute-0 nova_compute[186329]: 2025-12-05 06:17:39.486 186333 DEBUG nova.network.neutron [req-2cc1a3ff-893f-4055-806c-44742a81cc33 req-edd31a83-ff46-4f5c-a14d-fcae78c74c57 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 05 06:17:39 compute-0 nova_compute[186329]: 2025-12-05 06:17:39.672 186333 DEBUG nova.network.neutron [req-2cc1a3ff-893f-4055-806c-44742a81cc33 req-edd31a83-ff46-4f5c-a14d-fcae78c74c57 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:17:39 compute-0 nova_compute[186329]: 2025-12-05 06:17:39.995 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:17:40 compute-0 nova_compute[186329]: 2025-12-05 06:17:40.177 186333 DEBUG oslo_concurrency.lockutils [req-2cc1a3ff-893f-4055-806c-44742a81cc33 req-edd31a83-ff46-4f5c-a14d-fcae78c74c57 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Releasing lock "refresh_cache-34c83f7e-4b49-4a20-b482-394fe5b63c68" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:17:40 compute-0 nova_compute[186329]: 2025-12-05 06:17:40.177 186333 DEBUG oslo_concurrency.lockutils [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Acquired lock "refresh_cache-34c83f7e-4b49-4a20-b482-394fe5b63c68" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:17:40 compute-0 nova_compute[186329]: 2025-12-05 06:17:40.177 186333 DEBUG nova.network.neutron [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 05 06:17:41 compute-0 nova_compute[186329]: 2025-12-05 06:17:41.471 186333 DEBUG nova.network.neutron [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 05 06:17:41 compute-0 nova_compute[186329]: 2025-12-05 06:17:41.631 186333 WARNING neutronclient.v2_0.client [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:17:42 compute-0 nova_compute[186329]: 2025-12-05 06:17:42.499 186333 DEBUG nova.network.neutron [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Updating instance_info_cache with network_info: [{"id": "c00622c0-b6d3-4225-8e33-44b59cb8a42a", "address": "fa:16:3e:d2:4d:71", "network": {"id": "df5a1a6f-32f3-42ca-8c18-60ea4ce9d923", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1816665749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36f184a410c54894823168ed0f00b1ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00622c0-b6", "ovs_interfaceid": "c00622c0-b6d3-4225-8e33-44b59cb8a42a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.003 186333 DEBUG oslo_concurrency.lockutils [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Releasing lock "refresh_cache-34c83f7e-4b49-4a20-b482-394fe5b63c68" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.004 186333 DEBUG nova.compute.manager [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Instance network_info: |[{"id": "c00622c0-b6d3-4225-8e33-44b59cb8a42a", "address": "fa:16:3e:d2:4d:71", "network": {"id": "df5a1a6f-32f3-42ca-8c18-60ea4ce9d923", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1816665749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36f184a410c54894823168ed0f00b1ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00622c0-b6", "ovs_interfaceid": "c00622c0-b6d3-4225-8e33-44b59cb8a42a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.005 186333 DEBUG nova.virt.libvirt.driver [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Start _get_guest_xml network_info=[{"id": "c00622c0-b6d3-4225-8e33-44b59cb8a42a", "address": "fa:16:3e:d2:4d:71", "network": {"id": "df5a1a6f-32f3-42ca-8c18-60ea4ce9d923", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1816665749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36f184a410c54894823168ed0f00b1ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00622c0-b6", "ovs_interfaceid": "c00622c0-b6d3-4225-8e33-44b59cb8a42a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T06:07:47Z,direct_url=<?>,disk_format='qcow2',id=6903ca06-7f44-4ad2-ab8b-0d16feef7d51,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fcef582be2274b9ba43451b49b4066ec',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T06:07:49Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'device_name': '/dev/vda', 'guest_format': None, 'image_id': '6903ca06-7f44-4ad2-ab8b-0d16feef7d51'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.008 186333 WARNING nova.virt.libvirt.driver [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.009 186333 DEBUG nova.virt.driver [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='6903ca06-7f44-4ad2-ab8b-0d16feef7d51', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteHostMaintenanceStrategy-server-194938820', uuid='34c83f7e-4b49-4a20-b482-394fe5b63c68'), owner=OwnerMeta(userid='6e966152abb6429a8d2dc82faf5464b5', username='tempest-TestExecuteHostMaintenanceStrategy-1144698866-project-admin', projectid='72c4ee5cc96a42b99210abaf8ae6fcc3', projectname='tempest-TestExecuteHostMaintenanceStrategy-1144698866'), image=ImageMeta(id='6903ca06-7f44-4ad2-ab8b-0d16feef7d51', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='cb13e320-971c-46c2-a935-d695f3631bf8', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "c00622c0-b6d3-4225-8e33-44b59cb8a42a", "address": "fa:16:3e:d2:4d:71", "network": {"id": "df5a1a6f-32f3-42ca-8c18-60ea4ce9d923", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1816665749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36f184a410c54894823168ed0f00b1ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00622c0-b6", "ovs_interfaceid": 
"c00622c0-b6d3-4225-8e33-44b59cb8a42a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764915463.0094936) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.014 186333 DEBUG nova.virt.libvirt.host [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.014 186333 DEBUG nova.virt.libvirt.host [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.016 186333 DEBUG nova.virt.libvirt.host [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.017 186333 DEBUG nova.virt.libvirt.host [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.018 186333 DEBUG nova.virt.libvirt.driver [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.018 186333 DEBUG nova.virt.hardware [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T06:07:46Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cb13e320-971c-46c2-a935-d695f3631bf8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T06:07:47Z,direct_url=<?>,disk_format='qcow2',id=6903ca06-7f44-4ad2-ab8b-0d16feef7d51,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fcef582be2274b9ba43451b49b4066ec',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T06:07:49Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.018 186333 DEBUG nova.virt.hardware [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.018 186333 DEBUG nova.virt.hardware [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.018 186333 DEBUG nova.virt.hardware [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.019 186333 DEBUG nova.virt.hardware [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.019 186333 DEBUG nova.virt.hardware [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.019 186333 DEBUG nova.virt.hardware [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.019 186333 DEBUG nova.virt.hardware [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.019 186333 DEBUG nova.virt.hardware [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.019 186333 DEBUG nova.virt.hardware [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.020 186333 DEBUG nova.virt.hardware [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.022 186333 DEBUG nova.virt.libvirt.vif [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-05T06:17:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-194938820',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-194938820',id=9,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='72c4ee5cc96a42b99210abaf8ae6fcc3',ramdisk_id='',reservation_id='r-ly8dat25',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1144698866',owner_user_name='tempest-Test
ExecuteHostMaintenanceStrategy-1144698866-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T06:17:37Z,user_data=None,user_id='6e966152abb6429a8d2dc82faf5464b5',uuid=34c83f7e-4b49-4a20-b482-394fe5b63c68,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c00622c0-b6d3-4225-8e33-44b59cb8a42a", "address": "fa:16:3e:d2:4d:71", "network": {"id": "df5a1a6f-32f3-42ca-8c18-60ea4ce9d923", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1816665749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36f184a410c54894823168ed0f00b1ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00622c0-b6", "ovs_interfaceid": "c00622c0-b6d3-4225-8e33-44b59cb8a42a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.022 186333 DEBUG nova.network.os_vif_util [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Converting VIF {"id": "c00622c0-b6d3-4225-8e33-44b59cb8a42a", "address": "fa:16:3e:d2:4d:71", "network": {"id": "df5a1a6f-32f3-42ca-8c18-60ea4ce9d923", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1816665749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36f184a410c54894823168ed0f00b1ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00622c0-b6", "ovs_interfaceid": "c00622c0-b6d3-4225-8e33-44b59cb8a42a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.023 186333 DEBUG nova.network.os_vif_util [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d2:4d:71,bridge_name='br-int',has_traffic_filtering=True,id=c00622c0-b6d3-4225-8e33-44b59cb8a42a,network=Network(df5a1a6f-32f3-42ca-8c18-60ea4ce9d923),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc00622c0-b6') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.024 186333 DEBUG nova.objects.instance [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 34c83f7e-4b49-4a20-b482-394fe5b63c68 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.529 186333 DEBUG nova.virt.libvirt.driver [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] End _get_guest_xml xml=<domain type="kvm">
Dec 05 06:17:43 compute-0 nova_compute[186329]:   <uuid>34c83f7e-4b49-4a20-b482-394fe5b63c68</uuid>
Dec 05 06:17:43 compute-0 nova_compute[186329]:   <name>instance-00000009</name>
Dec 05 06:17:43 compute-0 nova_compute[186329]:   <memory>131072</memory>
Dec 05 06:17:43 compute-0 nova_compute[186329]:   <vcpu>1</vcpu>
Dec 05 06:17:43 compute-0 nova_compute[186329]:   <metadata>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 06:17:43 compute-0 nova_compute[186329]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-194938820</nova:name>
Dec 05 06:17:43 compute-0 nova_compute[186329]:       <nova:creationTime>2025-12-05 06:17:43</nova:creationTime>
Dec 05 06:17:43 compute-0 nova_compute[186329]:       <nova:flavor name="m1.nano" id="cb13e320-971c-46c2-a935-d695f3631bf8">
Dec 05 06:17:43 compute-0 nova_compute[186329]:         <nova:memory>128</nova:memory>
Dec 05 06:17:43 compute-0 nova_compute[186329]:         <nova:disk>1</nova:disk>
Dec 05 06:17:43 compute-0 nova_compute[186329]:         <nova:swap>0</nova:swap>
Dec 05 06:17:43 compute-0 nova_compute[186329]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 06:17:43 compute-0 nova_compute[186329]:         <nova:vcpus>1</nova:vcpus>
Dec 05 06:17:43 compute-0 nova_compute[186329]:         <nova:extraSpecs>
Dec 05 06:17:43 compute-0 nova_compute[186329]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 05 06:17:43 compute-0 nova_compute[186329]:         </nova:extraSpecs>
Dec 05 06:17:43 compute-0 nova_compute[186329]:       </nova:flavor>
Dec 05 06:17:43 compute-0 nova_compute[186329]:       <nova:image uuid="6903ca06-7f44-4ad2-ab8b-0d16feef7d51">
Dec 05 06:17:43 compute-0 nova_compute[186329]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 05 06:17:43 compute-0 nova_compute[186329]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 05 06:17:43 compute-0 nova_compute[186329]:         <nova:minDisk>1</nova:minDisk>
Dec 05 06:17:43 compute-0 nova_compute[186329]:         <nova:minRam>0</nova:minRam>
Dec 05 06:17:43 compute-0 nova_compute[186329]:         <nova:properties>
Dec 05 06:17:43 compute-0 nova_compute[186329]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 05 06:17:43 compute-0 nova_compute[186329]:         </nova:properties>
Dec 05 06:17:43 compute-0 nova_compute[186329]:       </nova:image>
Dec 05 06:17:43 compute-0 nova_compute[186329]:       <nova:owner>
Dec 05 06:17:43 compute-0 nova_compute[186329]:         <nova:user uuid="6e966152abb6429a8d2dc82faf5464b5">tempest-TestExecuteHostMaintenanceStrategy-1144698866-project-admin</nova:user>
Dec 05 06:17:43 compute-0 nova_compute[186329]:         <nova:project uuid="72c4ee5cc96a42b99210abaf8ae6fcc3">tempest-TestExecuteHostMaintenanceStrategy-1144698866</nova:project>
Dec 05 06:17:43 compute-0 nova_compute[186329]:       </nova:owner>
Dec 05 06:17:43 compute-0 nova_compute[186329]:       <nova:root type="image" uuid="6903ca06-7f44-4ad2-ab8b-0d16feef7d51"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:       <nova:ports>
Dec 05 06:17:43 compute-0 nova_compute[186329]:         <nova:port uuid="c00622c0-b6d3-4225-8e33-44b59cb8a42a">
Dec 05 06:17:43 compute-0 nova_compute[186329]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:         </nova:port>
Dec 05 06:17:43 compute-0 nova_compute[186329]:       </nova:ports>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     </nova:instance>
Dec 05 06:17:43 compute-0 nova_compute[186329]:   </metadata>
Dec 05 06:17:43 compute-0 nova_compute[186329]:   <sysinfo type="smbios">
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <system>
Dec 05 06:17:43 compute-0 nova_compute[186329]:       <entry name="manufacturer">RDO</entry>
Dec 05 06:17:43 compute-0 nova_compute[186329]:       <entry name="product">OpenStack Compute</entry>
Dec 05 06:17:43 compute-0 nova_compute[186329]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 05 06:17:43 compute-0 nova_compute[186329]:       <entry name="serial">34c83f7e-4b49-4a20-b482-394fe5b63c68</entry>
Dec 05 06:17:43 compute-0 nova_compute[186329]:       <entry name="uuid">34c83f7e-4b49-4a20-b482-394fe5b63c68</entry>
Dec 05 06:17:43 compute-0 nova_compute[186329]:       <entry name="family">Virtual Machine</entry>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     </system>
Dec 05 06:17:43 compute-0 nova_compute[186329]:   </sysinfo>
Dec 05 06:17:43 compute-0 nova_compute[186329]:   <os>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <boot dev="hd"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <smbios mode="sysinfo"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:   </os>
Dec 05 06:17:43 compute-0 nova_compute[186329]:   <features>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <acpi/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <apic/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <vmcoreinfo/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:   </features>
Dec 05 06:17:43 compute-0 nova_compute[186329]:   <clock offset="utc">
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <timer name="hpet" present="no"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:   </clock>
Dec 05 06:17:43 compute-0 nova_compute[186329]:   <cpu mode="custom" match="exact">
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <model>Nehalem</model>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:   </cpu>
Dec 05 06:17:43 compute-0 nova_compute[186329]:   <devices>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <disk type="file" device="disk">
Dec 05 06:17:43 compute-0 nova_compute[186329]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:       <source file="/var/lib/nova/instances/34c83f7e-4b49-4a20-b482-394fe5b63c68/disk"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:       <target dev="vda" bus="virtio"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     </disk>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <disk type="file" device="cdrom">
Dec 05 06:17:43 compute-0 nova_compute[186329]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:       <source file="/var/lib/nova/instances/34c83f7e-4b49-4a20-b482-394fe5b63c68/disk.config"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:       <target dev="sda" bus="sata"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     </disk>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <interface type="ethernet">
Dec 05 06:17:43 compute-0 nova_compute[186329]:       <mac address="fa:16:3e:d2:4d:71"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:       <model type="virtio"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:       <mtu size="1442"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:       <target dev="tapc00622c0-b6"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     </interface>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <serial type="pty">
Dec 05 06:17:43 compute-0 nova_compute[186329]:       <log file="/var/lib/nova/instances/34c83f7e-4b49-4a20-b482-394fe5b63c68/console.log" append="off"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     </serial>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <video>
Dec 05 06:17:43 compute-0 nova_compute[186329]:       <model type="virtio"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     </video>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <input type="tablet" bus="usb"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <rng model="virtio">
Dec 05 06:17:43 compute-0 nova_compute[186329]:       <backend model="random">/dev/urandom</backend>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     </rng>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <controller type="usb" index="0"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 05 06:17:43 compute-0 nova_compute[186329]:       <stats period="10"/>
Dec 05 06:17:43 compute-0 nova_compute[186329]:     </memballoon>
Dec 05 06:17:43 compute-0 nova_compute[186329]:   </devices>
Dec 05 06:17:43 compute-0 nova_compute[186329]: </domain>
Dec 05 06:17:43 compute-0 nova_compute[186329]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.530 186333 DEBUG nova.compute.manager [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Preparing to wait for external event network-vif-plugged-c00622c0-b6d3-4225-8e33-44b59cb8a42a prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.530 186333 DEBUG oslo_concurrency.lockutils [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Acquiring lock "34c83f7e-4b49-4a20-b482-394fe5b63c68-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.530 186333 DEBUG oslo_concurrency.lockutils [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "34c83f7e-4b49-4a20-b482-394fe5b63c68-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.530 186333 DEBUG oslo_concurrency.lockutils [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "34c83f7e-4b49-4a20-b482-394fe5b63c68-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.531 186333 DEBUG nova.virt.libvirt.vif [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-05T06:17:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-194938820',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-194938820',id=9,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='72c4ee5cc96a42b99210abaf8ae6fcc3',ramdisk_id='',reservation_id='r-ly8dat25',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1144698866',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1144698866-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T06:17:37Z,user_data=None,user_id='6e966152abb6429a8d2dc82faf5464b5',uuid=34c83f7e-4b49-4a20-b482-394fe5b63c68,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c00622c0-b6d3-4225-8e33-44b59cb8a42a", "address": "fa:16:3e:d2:4d:71", "network": {"id": "df5a1a6f-32f3-42ca-8c18-60ea4ce9d923", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1816665749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36f184a410c54894823168ed0f00b1ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00622c0-b6", "ovs_interfaceid": "c00622c0-b6d3-4225-8e33-44b59cb8a42a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.531 186333 DEBUG nova.network.os_vif_util [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Converting VIF {"id": "c00622c0-b6d3-4225-8e33-44b59cb8a42a", "address": "fa:16:3e:d2:4d:71", "network": {"id": "df5a1a6f-32f3-42ca-8c18-60ea4ce9d923", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1816665749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36f184a410c54894823168ed0f00b1ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00622c0-b6", "ovs_interfaceid": "c00622c0-b6d3-4225-8e33-44b59cb8a42a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.532 186333 DEBUG nova.network.os_vif_util [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d2:4d:71,bridge_name='br-int',has_traffic_filtering=True,id=c00622c0-b6d3-4225-8e33-44b59cb8a42a,network=Network(df5a1a6f-32f3-42ca-8c18-60ea4ce9d923),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc00622c0-b6') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.532 186333 DEBUG os_vif [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d2:4d:71,bridge_name='br-int',has_traffic_filtering=True,id=c00622c0-b6d3-4225-8e33-44b59cb8a42a,network=Network(df5a1a6f-32f3-42ca-8c18-60ea4ce9d923),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc00622c0-b6') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.532 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.532 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.533 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.533 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.533 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '458d17bb-a27f-534c-808f-d9c151b8b7c2', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.534 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.535 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.537 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.537 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc00622c0-b6, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.538 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapc00622c0-b6, col_values=(('qos', UUID('530f62a2-9864-4522-8781-8870e95db287')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.538 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapc00622c0-b6, col_values=(('external_ids', {'iface-id': 'c00622c0-b6d3-4225-8e33-44b59cb8a42a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d2:4d:71', 'vm-uuid': '34c83f7e-4b49-4a20-b482-394fe5b63c68'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.539 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:17:43 compute-0 NetworkManager[55434]: <info>  [1764915463.5399] manager: (tapc00622c0-b6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.541 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.543 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.543 186333 INFO os_vif [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d2:4d:71,bridge_name='br-int',has_traffic_filtering=True,id=c00622c0-b6d3-4225-8e33-44b59cb8a42a,network=Network(df5a1a6f-32f3-42ca-8c18-60ea4ce9d923),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc00622c0-b6')
Dec 05 06:17:43 compute-0 nova_compute[186329]: 2025-12-05 06:17:43.676 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:17:45 compute-0 nova_compute[186329]: 2025-12-05 06:17:45.068 186333 DEBUG nova.virt.libvirt.driver [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 05 06:17:45 compute-0 nova_compute[186329]: 2025-12-05 06:17:45.068 186333 DEBUG nova.virt.libvirt.driver [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 05 06:17:45 compute-0 nova_compute[186329]: 2025-12-05 06:17:45.068 186333 DEBUG nova.virt.libvirt.driver [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] No VIF found with MAC fa:16:3e:d2:4d:71, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 05 06:17:45 compute-0 nova_compute[186329]: 2025-12-05 06:17:45.068 186333 INFO nova.virt.libvirt.driver [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Using config drive
Dec 05 06:17:45 compute-0 nova_compute[186329]: 2025-12-05 06:17:45.575 186333 WARNING neutronclient.v2_0.client [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:17:45 compute-0 nova_compute[186329]: 2025-12-05 06:17:45.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:17:45 compute-0 nova_compute[186329]: 2025-12-05 06:17:45.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:17:45 compute-0 nova_compute[186329]: 2025-12-05 06:17:45.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:17:46 compute-0 nova_compute[186329]: 2025-12-05 06:17:46.219 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:17:46 compute-0 nova_compute[186329]: 2025-12-05 06:17:46.219 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:17:46 compute-0 nova_compute[186329]: 2025-12-05 06:17:46.219 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:17:46 compute-0 nova_compute[186329]: 2025-12-05 06:17:46.220 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 05 06:17:46 compute-0 nova_compute[186329]: 2025-12-05 06:17:46.577 186333 INFO nova.virt.libvirt.driver [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Creating config drive at /var/lib/nova/instances/34c83f7e-4b49-4a20-b482-394fe5b63c68/disk.config
Dec 05 06:17:46 compute-0 nova_compute[186329]: 2025-12-05 06:17:46.582 186333 DEBUG oslo_concurrency.processutils [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/34c83f7e-4b49-4a20-b482-394fe5b63c68/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmposuiybqy execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:17:46 compute-0 nova_compute[186329]: 2025-12-05 06:17:46.701 186333 DEBUG oslo_concurrency.processutils [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/34c83f7e-4b49-4a20-b482-394fe5b63c68/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmposuiybqy" returned: 0 in 0.119s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:17:46 compute-0 NetworkManager[55434]: <info>  [1764915466.7397] manager: (tapc00622c0-b6): new Tun device (/org/freedesktop/NetworkManager/Devices/38)
Dec 05 06:17:46 compute-0 kernel: tapc00622c0-b6: entered promiscuous mode
Dec 05 06:17:46 compute-0 nova_compute[186329]: 2025-12-05 06:17:46.741 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:17:46 compute-0 ovn_controller[95223]: 2025-12-05T06:17:46Z|00076|binding|INFO|Claiming lport c00622c0-b6d3-4225-8e33-44b59cb8a42a for this chassis.
Dec 05 06:17:46 compute-0 ovn_controller[95223]: 2025-12-05T06:17:46Z|00077|binding|INFO|c00622c0-b6d3-4225-8e33-44b59cb8a42a: Claiming fa:16:3e:d2:4d:71 10.100.0.4
Dec 05 06:17:46 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:17:46.750 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d2:4d:71 10.100.0.4'], port_security=['fa:16:3e:d2:4d:71 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '34c83f7e-4b49-4a20-b482-394fe5b63c68', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '72c4ee5cc96a42b99210abaf8ae6fcc3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8deca48c-f664-4748-b23a-2c5f69c6b17a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1542876b-3afe-4c08-a982-954e6ce063b6, chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], logical_port=c00622c0-b6d3-4225-8e33-44b59cb8a42a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:17:46 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:17:46.756 104041 INFO neutron.agent.ovn.metadata.agent [-] Port c00622c0-b6d3-4225-8e33-44b59cb8a42a in datapath df5a1a6f-32f3-42ca-8c18-60ea4ce9d923 bound to our chassis
Dec 05 06:17:46 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:17:46.756 104041 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network df5a1a6f-32f3-42ca-8c18-60ea4ce9d923
Dec 05 06:17:46 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:17:46.769 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[1a77bf34-d301-4b81-aafa-193a4bd9cbf1]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:17:46 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:17:46.771 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdf5a1a6f-31 in ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 05 06:17:46 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:17:46.772 206383 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdf5a1a6f-30 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 05 06:17:46 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:17:46.772 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[98749dfb-3b26-46de-9b65-1d0f7547d221]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:17:46 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:17:46.779 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[31fab1d4-034e-482c-b868-ff919e174601]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:17:46 compute-0 systemd-udevd[209356]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 06:17:46 compute-0 NetworkManager[55434]: <info>  [1764915466.7922] device (tapc00622c0-b6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 06:17:46 compute-0 NetworkManager[55434]: <info>  [1764915466.7930] device (tapc00622c0-b6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 06:17:46 compute-0 systemd-machined[152967]: New machine qemu-5-instance-00000009.
Dec 05 06:17:46 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:17:46.799 104153 DEBUG oslo.privsep.daemon [-] privsep: reply[36681f11-3b1a-4725-9f85-f93a0940fc3a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:17:46 compute-0 systemd[1]: Started Virtual Machine qemu-5-instance-00000009.
Dec 05 06:17:46 compute-0 nova_compute[186329]: 2025-12-05 06:17:46.806 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:17:46 compute-0 ovn_controller[95223]: 2025-12-05T06:17:46Z|00078|binding|INFO|Setting lport c00622c0-b6d3-4225-8e33-44b59cb8a42a ovn-installed in OVS
Dec 05 06:17:46 compute-0 ovn_controller[95223]: 2025-12-05T06:17:46Z|00079|binding|INFO|Setting lport c00622c0-b6d3-4225-8e33-44b59cb8a42a up in Southbound
Dec 05 06:17:46 compute-0 nova_compute[186329]: 2025-12-05 06:17:46.810 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:17:46 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:17:46.816 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[8a4059ee-7539-4129-8707-e87556f6bbd9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:17:46 compute-0 podman[209323]: 2025-12-05 06:17:46.836441489 +0000 UTC m=+0.107984120 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec 05 06:17:46 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:17:46.842 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[c30ce9f5-8293-4854-bda5-f5145db5f0a7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:17:46 compute-0 NetworkManager[55434]: <info>  [1764915466.8451] manager: (tapdf5a1a6f-30): new Veth device (/org/freedesktop/NetworkManager/Devices/39)
Dec 05 06:17:46 compute-0 systemd-udevd[209364]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 06:17:46 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:17:46.846 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[c85f0dc7-a3a2-471c-bd1c-301205418f61]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:17:46 compute-0 podman[209328]: 2025-12-05 06:17:46.856692574 +0000 UTC m=+0.125379996 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 06:17:46 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:17:46.871 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[e4b61350-990f-4502-9472-ed2a2d595b9b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:17:46 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:17:46.875 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[16818b72-0cdc-44fc-b9b3-660afcc89f64]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:17:46 compute-0 NetworkManager[55434]: <info>  [1764915466.8920] device (tapdf5a1a6f-30): carrier: link connected
Dec 05 06:17:46 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:17:46.895 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[7c30b264-6f90-447a-b069-623ff3f4c9c4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:17:46 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:17:46.908 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[5ee608cc-d4c1-401c-90fc-d7ec0892a085]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdf5a1a6f-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:f8:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 334542, 'reachable_time': 40776, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209407, 'error': None, 'target': 'ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:17:46 compute-0 nova_compute[186329]: 2025-12-05 06:17:46.912 186333 DEBUG nova.compute.manager [req-ad3541df-371e-4406-9169-281e851b403a req-a0bd9308-9aca-4808-9452-16210888e734 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Received event network-vif-plugged-c00622c0-b6d3-4225-8e33-44b59cb8a42a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:17:46 compute-0 nova_compute[186329]: 2025-12-05 06:17:46.913 186333 DEBUG oslo_concurrency.lockutils [req-ad3541df-371e-4406-9169-281e851b403a req-a0bd9308-9aca-4808-9452-16210888e734 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "34c83f7e-4b49-4a20-b482-394fe5b63c68-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:17:46 compute-0 nova_compute[186329]: 2025-12-05 06:17:46.913 186333 DEBUG oslo_concurrency.lockutils [req-ad3541df-371e-4406-9169-281e851b403a req-a0bd9308-9aca-4808-9452-16210888e734 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "34c83f7e-4b49-4a20-b482-394fe5b63c68-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:17:46 compute-0 nova_compute[186329]: 2025-12-05 06:17:46.913 186333 DEBUG oslo_concurrency.lockutils [req-ad3541df-371e-4406-9169-281e851b403a req-a0bd9308-9aca-4808-9452-16210888e734 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "34c83f7e-4b49-4a20-b482-394fe5b63c68-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:17:46 compute-0 nova_compute[186329]: 2025-12-05 06:17:46.914 186333 DEBUG nova.compute.manager [req-ad3541df-371e-4406-9169-281e851b403a req-a0bd9308-9aca-4808-9452-16210888e734 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Processing event network-vif-plugged-c00622c0-b6d3-4225-8e33-44b59cb8a42a _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 05 06:17:46 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:17:46.917 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[636e3758-f3e6-43d5-b90e-2be026217445]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea7:f89d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 334542, 'tstamp': 334542}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209409, 'error': None, 'target': 'ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:17:46 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:17:46.928 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[c9207e6a-7e55-4752-a951-618b971a5b22]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdf5a1a6f-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:f8:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 334542, 'reachable_time': 40776, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 209410, 'error': None, 'target': 'ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:17:46 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:17:46.949 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[43b21beb-767f-4485-a348-b45d1a0b8d82]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:17:46 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:17:46.987 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[a0b20273-8db6-468f-bb41-3c970e1008ce]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:17:46 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:17:46.988 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdf5a1a6f-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:17:46 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:17:46.988 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:17:46 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:17:46.988 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdf5a1a6f-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:17:46 compute-0 kernel: tapdf5a1a6f-30: entered promiscuous mode
Dec 05 06:17:46 compute-0 NetworkManager[55434]: <info>  [1764915466.9914] manager: (tapdf5a1a6f-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Dec 05 06:17:46 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:17:46.991 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdf5a1a6f-30, col_values=(('external_ids', {'iface-id': '11b2e7a6-c4ec-4f31-8535-807d9ce71179'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:17:46 compute-0 nova_compute[186329]: 2025-12-05 06:17:46.992 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:17:46 compute-0 nova_compute[186329]: 2025-12-05 06:17:46.994 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:17:46 compute-0 ovn_controller[95223]: 2025-12-05T06:17:46Z|00080|binding|INFO|Releasing lport 11b2e7a6-c4ec-4f31-8535-807d9ce71179 from this chassis (sb_readonly=0)
Dec 05 06:17:46 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:17:46.999 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[b2e6d431-020c-4047-9fa2-6dd1853b3e3e]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:17:46 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:17:46.999 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/df5a1a6f-32f3-42ca-8c18-60ea4ce9d923.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/df5a1a6f-32f3-42ca-8c18-60ea4ce9d923.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:17:46 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:17:46.999 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/df5a1a6f-32f3-42ca-8c18-60ea4ce9d923.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/df5a1a6f-32f3-42ca-8c18-60ea4ce9d923.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:17:47 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:17:46.999 104041 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for df5a1a6f-32f3-42ca-8c18-60ea4ce9d923 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 05 06:17:47 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:17:47.000 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/df5a1a6f-32f3-42ca-8c18-60ea4ce9d923.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/df5a1a6f-32f3-42ca-8c18-60ea4ce9d923.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:17:47 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:17:47.000 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[c7ad2541-c14a-4527-8f43-b3116c78f299]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:17:47 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:17:47.000 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/df5a1a6f-32f3-42ca-8c18-60ea4ce9d923.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/df5a1a6f-32f3-42ca-8c18-60ea4ce9d923.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:17:47 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:17:47.001 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[8d557d39-7a23-4d42-9291-a5af3ea6999a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:17:47 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:17:47.001 104041 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 05 06:17:47 compute-0 ovn_metadata_agent[104036]: global
Dec 05 06:17:47 compute-0 ovn_metadata_agent[104036]:     log         /dev/log local0 debug
Dec 05 06:17:47 compute-0 ovn_metadata_agent[104036]:     log-tag     haproxy-metadata-proxy-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923
Dec 05 06:17:47 compute-0 ovn_metadata_agent[104036]:     user        root
Dec 05 06:17:47 compute-0 ovn_metadata_agent[104036]:     group       root
Dec 05 06:17:47 compute-0 ovn_metadata_agent[104036]:     maxconn     1024
Dec 05 06:17:47 compute-0 ovn_metadata_agent[104036]:     pidfile     /var/lib/neutron/external/pids/df5a1a6f-32f3-42ca-8c18-60ea4ce9d923.pid.haproxy
Dec 05 06:17:47 compute-0 ovn_metadata_agent[104036]:     daemon
Dec 05 06:17:47 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:17:47 compute-0 ovn_metadata_agent[104036]: defaults
Dec 05 06:17:47 compute-0 ovn_metadata_agent[104036]:     log global
Dec 05 06:17:47 compute-0 ovn_metadata_agent[104036]:     mode http
Dec 05 06:17:47 compute-0 ovn_metadata_agent[104036]:     option httplog
Dec 05 06:17:47 compute-0 ovn_metadata_agent[104036]:     option dontlognull
Dec 05 06:17:47 compute-0 ovn_metadata_agent[104036]:     option http-server-close
Dec 05 06:17:47 compute-0 ovn_metadata_agent[104036]:     option forwardfor
Dec 05 06:17:47 compute-0 ovn_metadata_agent[104036]:     retries                 3
Dec 05 06:17:47 compute-0 ovn_metadata_agent[104036]:     timeout http-request    30s
Dec 05 06:17:47 compute-0 ovn_metadata_agent[104036]:     timeout connect         30s
Dec 05 06:17:47 compute-0 ovn_metadata_agent[104036]:     timeout client          32s
Dec 05 06:17:47 compute-0 ovn_metadata_agent[104036]:     timeout server          32s
Dec 05 06:17:47 compute-0 ovn_metadata_agent[104036]:     timeout http-keep-alive 30s
Dec 05 06:17:47 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:17:47 compute-0 ovn_metadata_agent[104036]: listen listener
Dec 05 06:17:47 compute-0 ovn_metadata_agent[104036]:     bind 169.254.169.254:80
Dec 05 06:17:47 compute-0 ovn_metadata_agent[104036]:     
Dec 05 06:17:47 compute-0 ovn_metadata_agent[104036]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 06:17:47 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:17:47 compute-0 ovn_metadata_agent[104036]:     http-request add-header X-OVN-Network-ID df5a1a6f-32f3-42ca-8c18-60ea4ce9d923
Dec 05 06:17:47 compute-0 ovn_metadata_agent[104036]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 05 06:17:47 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:17:47.002 104041 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923', 'env', 'PROCESS_TAG=haproxy-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/df5a1a6f-32f3-42ca-8c18-60ea4ce9d923.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 05 06:17:47 compute-0 nova_compute[186329]: 2025-12-05 06:17:47.006 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:17:47 compute-0 nova_compute[186329]: 2025-12-05 06:17:47.246 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/34c83f7e-4b49-4a20-b482-394fe5b63c68/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:17:47 compute-0 nova_compute[186329]: 2025-12-05 06:17:47.252 186333 DEBUG nova.compute.manager [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 05 06:17:47 compute-0 nova_compute[186329]: 2025-12-05 06:17:47.256 186333 DEBUG nova.virt.libvirt.driver [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Dec 05 06:17:47 compute-0 nova_compute[186329]: 2025-12-05 06:17:47.263 186333 INFO nova.virt.libvirt.driver [-] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Instance spawned successfully.
Dec 05 06:17:47 compute-0 nova_compute[186329]: 2025-12-05 06:17:47.264 186333 DEBUG nova.virt.libvirt.driver [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Dec 05 06:17:47 compute-0 nova_compute[186329]: 2025-12-05 06:17:47.293 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/34c83f7e-4b49-4a20-b482-394fe5b63c68/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:17:47 compute-0 nova_compute[186329]: 2025-12-05 06:17:47.293 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/34c83f7e-4b49-4a20-b482-394fe5b63c68/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:17:47 compute-0 podman[209446]: 2025-12-05 06:17:47.335751437 +0000 UTC m=+0.047285205 container create 579011dab85dd619f6222216aa93257972323e19131c63c3e2a3769885264cf3 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, maintainer=OpenStack Kubernetes Operator team)
Dec 05 06:17:47 compute-0 nova_compute[186329]: 2025-12-05 06:17:47.343 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/34c83f7e-4b49-4a20-b482-394fe5b63c68/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:17:47 compute-0 systemd[1]: Started libpod-conmon-579011dab85dd619f6222216aa93257972323e19131c63c3e2a3769885264cf3.scope.
Dec 05 06:17:47 compute-0 systemd[1]: Started libcrun container.
Dec 05 06:17:47 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6295eea2b9baea2078ea4e8cbceb523233f475adf88d43760b474a6a2b597cd8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 06:17:47 compute-0 podman[209446]: 2025-12-05 06:17:47.399335587 +0000 UTC m=+0.110869374 container init 579011dab85dd619f6222216aa93257972323e19131c63c3e2a3769885264cf3 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4)
Dec 05 06:17:47 compute-0 podman[209446]: 2025-12-05 06:17:47.405566642 +0000 UTC m=+0.117100410 container start 579011dab85dd619f6222216aa93257972323e19131c63c3e2a3769885264cf3 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec)
Dec 05 06:17:47 compute-0 podman[209446]: 2025-12-05 06:17:47.321161436 +0000 UTC m=+0.032695223 image pull 773fbabd60bc1c8e2ad203301a187a593937fa42bcc751aba0971bd96baac0cb quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current
Dec 05 06:17:47 compute-0 neutron-haproxy-ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923[209465]: [NOTICE]   (209469) : New worker (209471) forked
Dec 05 06:17:47 compute-0 neutron-haproxy-ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923[209465]: [NOTICE]   (209469) : Loading success.
Dec 05 06:17:47 compute-0 nova_compute[186329]: 2025-12-05 06:17:47.577 186333 WARNING nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:17:47 compute-0 nova_compute[186329]: 2025-12-05 06:17:47.580 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:17:47 compute-0 nova_compute[186329]: 2025-12-05 06:17:47.595 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.015s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:17:47 compute-0 nova_compute[186329]: 2025-12-05 06:17:47.596 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5861MB free_disk=73.16900634765625GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": 
"0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 05 06:17:47 compute-0 nova_compute[186329]: 2025-12-05 06:17:47.596 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:17:47 compute-0 nova_compute[186329]: 2025-12-05 06:17:47.596 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:17:47 compute-0 nova_compute[186329]: 2025-12-05 06:17:47.776 186333 DEBUG nova.virt.libvirt.driver [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:17:47 compute-0 nova_compute[186329]: 2025-12-05 06:17:47.776 186333 DEBUG nova.virt.libvirt.driver [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:17:47 compute-0 nova_compute[186329]: 2025-12-05 06:17:47.777 186333 DEBUG nova.virt.libvirt.driver [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:17:47 compute-0 nova_compute[186329]: 2025-12-05 06:17:47.777 186333 DEBUG nova.virt.libvirt.driver [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:17:47 compute-0 nova_compute[186329]: 2025-12-05 06:17:47.777 186333 DEBUG nova.virt.libvirt.driver [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:17:47 compute-0 nova_compute[186329]: 2025-12-05 06:17:47.778 186333 DEBUG nova.virt.libvirt.driver [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:17:48 compute-0 nova_compute[186329]: 2025-12-05 06:17:48.286 186333 INFO nova.compute.manager [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Took 9.53 seconds to spawn the instance on the hypervisor.
Dec 05 06:17:48 compute-0 nova_compute[186329]: 2025-12-05 06:17:48.286 186333 DEBUG nova.compute.manager [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 05 06:17:48 compute-0 nova_compute[186329]: 2025-12-05 06:17:48.541 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:17:48 compute-0 nova_compute[186329]: 2025-12-05 06:17:48.677 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:17:48 compute-0 nova_compute[186329]: 2025-12-05 06:17:48.806 186333 INFO nova.compute.manager [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Took 14.71 seconds to build instance.
Dec 05 06:17:48 compute-0 nova_compute[186329]: 2025-12-05 06:17:48.982 186333 DEBUG nova.compute.manager [req-7ab17be9-8dd7-4ed4-9eee-b8f6794bbd75 req-56148d50-896c-42b5-97c4-da9cd37bfa17 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Received event network-vif-plugged-c00622c0-b6d3-4225-8e33-44b59cb8a42a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:17:48 compute-0 nova_compute[186329]: 2025-12-05 06:17:48.982 186333 DEBUG oslo_concurrency.lockutils [req-7ab17be9-8dd7-4ed4-9eee-b8f6794bbd75 req-56148d50-896c-42b5-97c4-da9cd37bfa17 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "34c83f7e-4b49-4a20-b482-394fe5b63c68-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:17:48 compute-0 nova_compute[186329]: 2025-12-05 06:17:48.983 186333 DEBUG oslo_concurrency.lockutils [req-7ab17be9-8dd7-4ed4-9eee-b8f6794bbd75 req-56148d50-896c-42b5-97c4-da9cd37bfa17 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "34c83f7e-4b49-4a20-b482-394fe5b63c68-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:17:48 compute-0 nova_compute[186329]: 2025-12-05 06:17:48.983 186333 DEBUG oslo_concurrency.lockutils [req-7ab17be9-8dd7-4ed4-9eee-b8f6794bbd75 req-56148d50-896c-42b5-97c4-da9cd37bfa17 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "34c83f7e-4b49-4a20-b482-394fe5b63c68-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:17:48 compute-0 nova_compute[186329]: 2025-12-05 06:17:48.983 186333 DEBUG nova.compute.manager [req-7ab17be9-8dd7-4ed4-9eee-b8f6794bbd75 req-56148d50-896c-42b5-97c4-da9cd37bfa17 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] No waiting events found dispatching network-vif-plugged-c00622c0-b6d3-4225-8e33-44b59cb8a42a pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:17:48 compute-0 nova_compute[186329]: 2025-12-05 06:17:48.983 186333 WARNING nova.compute.manager [req-7ab17be9-8dd7-4ed4-9eee-b8f6794bbd75 req-56148d50-896c-42b5-97c4-da9cd37bfa17 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Received unexpected event network-vif-plugged-c00622c0-b6d3-4225-8e33-44b59cb8a42a for instance with vm_state active and task_state None.
Dec 05 06:17:49 compute-0 nova_compute[186329]: 2025-12-05 06:17:49.135 186333 INFO nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance 83dcc9e9-9fcf-4e34-8b76-598c38ed9bc0 has allocations against this compute host but is not found in the database.
Dec 05 06:17:49 compute-0 nova_compute[186329]: 2025-12-05 06:17:49.135 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance 34c83f7e-4b49-4a20-b482-394fe5b63c68 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 05 06:17:49 compute-0 nova_compute[186329]: 2025-12-05 06:17:49.136 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 05 06:17:49 compute-0 nova_compute[186329]: 2025-12-05 06:17:49.136 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 06:17:47 up 55 min,  0 user,  load average: 0.21, 0.23, 0.32\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_spawning': '1', 'num_os_type_None': '1', 'num_proj_72c4ee5cc96a42b99210abaf8ae6fcc3': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 05 06:17:49 compute-0 nova_compute[186329]: 2025-12-05 06:17:49.175 186333 DEBUG nova.compute.provider_tree [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:17:49 compute-0 nova_compute[186329]: 2025-12-05 06:17:49.312 186333 DEBUG oslo_concurrency.lockutils [None req-2ef9fb51-ea21-4c7a-bae3-cbc497cc0c09 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "34c83f7e-4b49-4a20-b482-394fe5b63c68" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.224s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:17:49 compute-0 nova_compute[186329]: 2025-12-05 06:17:49.680 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:17:50 compute-0 nova_compute[186329]: 2025-12-05 06:17:50.186 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 05 06:17:50 compute-0 nova_compute[186329]: 2025-12-05 06:17:50.186 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.590s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:17:51 compute-0 nova_compute[186329]: 2025-12-05 06:17:51.186 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:17:51 compute-0 nova_compute[186329]: 2025-12-05 06:17:51.186 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:17:51 compute-0 nova_compute[186329]: 2025-12-05 06:17:51.186 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:17:51 compute-0 nova_compute[186329]: 2025-12-05 06:17:51.186 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:17:51 compute-0 nova_compute[186329]: 2025-12-05 06:17:51.187 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 05 06:17:52 compute-0 nova_compute[186329]: 2025-12-05 06:17:52.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:17:53 compute-0 nova_compute[186329]: 2025-12-05 06:17:53.544 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:17:53 compute-0 nova_compute[186329]: 2025-12-05 06:17:53.679 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:17:54 compute-0 podman[209477]: 2025-12-05 06:17:54.48145723 +0000 UTC m=+0.053234110 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, 
org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 05 06:17:54 compute-0 podman[209478]: 2025-12-05 06:17:54.491425994 +0000 UTC m=+0.061946692 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, distribution-scope=public, release=1755695350, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, config_id=edpm, io.openshift.expose-services=)
Dec 05 06:17:54 compute-0 podman[209479]: 2025-12-05 06:17:54.526553881 +0000 UTC m=+0.093206876 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd)
Dec 05 06:17:58 compute-0 ovn_controller[95223]: 2025-12-05T06:17:58Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d2:4d:71 10.100.0.4
Dec 05 06:17:58 compute-0 ovn_controller[95223]: 2025-12-05T06:17:58Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d2:4d:71 10.100.0.4
Dec 05 06:17:58 compute-0 nova_compute[186329]: 2025-12-05 06:17:58.546 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:17:58 compute-0 nova_compute[186329]: 2025-12-05 06:17:58.680 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:17:59 compute-0 nova_compute[186329]: 2025-12-05 06:17:59.701 186333 DEBUG nova.virt.libvirt.driver [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8937fe36-e7a1-49d6-947a-6627f5bbfffb] Creating tmpfile /var/lib/nova/instances/tmpznsgapyx to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Dec 05 06:17:59 compute-0 nova_compute[186329]: 2025-12-05 06:17:59.702 186333 WARNING neutronclient.v2_0.client [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:17:59 compute-0 podman[196599]: time="2025-12-05T06:17:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:17:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:17:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18590 "" "Go-http-client/1.1"
Dec 05 06:17:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:17:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3047 "" "Go-http-client/1.1"
Dec 05 06:17:59 compute-0 nova_compute[186329]: 2025-12-05 06:17:59.778 186333 DEBUG nova.compute.manager [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpznsgapyx',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Dec 05 06:18:01 compute-0 openstack_network_exporter[198686]: ERROR   06:18:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:18:01 compute-0 openstack_network_exporter[198686]: ERROR   06:18:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:18:01 compute-0 openstack_network_exporter[198686]: ERROR   06:18:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:18:01 compute-0 openstack_network_exporter[198686]: ERROR   06:18:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:18:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:18:01 compute-0 openstack_network_exporter[198686]: ERROR   06:18:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:18:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:18:01 compute-0 nova_compute[186329]: 2025-12-05 06:18:01.803 186333 WARNING neutronclient.v2_0.client [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:18:03 compute-0 nova_compute[186329]: 2025-12-05 06:18:03.549 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:03 compute-0 nova_compute[186329]: 2025-12-05 06:18:03.682 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:06 compute-0 nova_compute[186329]: 2025-12-05 06:18:06.255 186333 DEBUG nova.compute.manager [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpznsgapyx',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='8937fe36-e7a1-49d6-947a-6627f5bbfffb',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Dec 05 06:18:07 compute-0 nova_compute[186329]: 2025-12-05 06:18:07.265 186333 DEBUG oslo_concurrency.lockutils [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "refresh_cache-8937fe36-e7a1-49d6-947a-6627f5bbfffb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:18:07 compute-0 nova_compute[186329]: 2025-12-05 06:18:07.266 186333 DEBUG oslo_concurrency.lockutils [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquired lock "refresh_cache-8937fe36-e7a1-49d6-947a-6627f5bbfffb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:18:07 compute-0 nova_compute[186329]: 2025-12-05 06:18:07.266 186333 DEBUG nova.network.neutron [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8937fe36-e7a1-49d6-947a-6627f5bbfffb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 05 06:18:07 compute-0 nova_compute[186329]: 2025-12-05 06:18:07.770 186333 WARNING neutronclient.v2_0.client [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:18:08 compute-0 nova_compute[186329]: 2025-12-05 06:18:08.550 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:08 compute-0 nova_compute[186329]: 2025-12-05 06:18:08.683 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:09 compute-0 nova_compute[186329]: 2025-12-05 06:18:09.666 186333 WARNING neutronclient.v2_0.client [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:18:09 compute-0 nova_compute[186329]: 2025-12-05 06:18:09.833 186333 DEBUG nova.network.neutron [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8937fe36-e7a1-49d6-947a-6627f5bbfffb] Updating instance_info_cache with network_info: [{"id": "c5f3cbbb-3662-4096-8009-80a6ae4ad778", "address": "fa:16:3e:23:3d:38", "network": {"id": "df5a1a6f-32f3-42ca-8c18-60ea4ce9d923", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1816665749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36f184a410c54894823168ed0f00b1ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5f3cbbb-36", "ovs_interfaceid": "c5f3cbbb-3662-4096-8009-80a6ae4ad778", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:18:10 compute-0 nova_compute[186329]: 2025-12-05 06:18:10.338 186333 DEBUG oslo_concurrency.lockutils [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Releasing lock "refresh_cache-8937fe36-e7a1-49d6-947a-6627f5bbfffb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:18:10 compute-0 nova_compute[186329]: 2025-12-05 06:18:10.344 186333 DEBUG nova.virt.libvirt.driver [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8937fe36-e7a1-49d6-947a-6627f5bbfffb] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpznsgapyx',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='8937fe36-e7a1-49d6-947a-6627f5bbfffb',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Dec 05 06:18:10 compute-0 nova_compute[186329]: 2025-12-05 06:18:10.345 186333 DEBUG nova.virt.libvirt.driver [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8937fe36-e7a1-49d6-947a-6627f5bbfffb] Creating instance directory: /var/lib/nova/instances/8937fe36-e7a1-49d6-947a-6627f5bbfffb pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Dec 05 06:18:10 compute-0 nova_compute[186329]: 2025-12-05 06:18:10.345 186333 DEBUG nova.virt.libvirt.driver [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8937fe36-e7a1-49d6-947a-6627f5bbfffb] Creating disk.info with the contents: {'/var/lib/nova/instances/8937fe36-e7a1-49d6-947a-6627f5bbfffb/disk': 'qcow2', '/var/lib/nova/instances/8937fe36-e7a1-49d6-947a-6627f5bbfffb/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Dec 05 06:18:10 compute-0 nova_compute[186329]: 2025-12-05 06:18:10.346 186333 DEBUG nova.virt.libvirt.driver [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8937fe36-e7a1-49d6-947a-6627f5bbfffb] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Dec 05 06:18:10 compute-0 nova_compute[186329]: 2025-12-05 06:18:10.346 186333 DEBUG nova.objects.instance [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 8937fe36-e7a1-49d6-947a-6627f5bbfffb obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:18:10 compute-0 nova_compute[186329]: 2025-12-05 06:18:10.850 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:18:10 compute-0 nova_compute[186329]: 2025-12-05 06:18:10.853 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:18:10 compute-0 nova_compute[186329]: 2025-12-05 06:18:10.854 186333 DEBUG oslo_concurrency.processutils [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:18:10 compute-0 nova_compute[186329]: 2025-12-05 06:18:10.897 186333 DEBUG oslo_concurrency.processutils [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:18:10 compute-0 nova_compute[186329]: 2025-12-05 06:18:10.898 186333 DEBUG oslo_concurrency.lockutils [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:18:10 compute-0 nova_compute[186329]: 2025-12-05 06:18:10.898 186333 DEBUG oslo_concurrency.lockutils [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:18:10 compute-0 nova_compute[186329]: 2025-12-05 06:18:10.899 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:18:10 compute-0 nova_compute[186329]: 2025-12-05 06:18:10.901 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:18:10 compute-0 nova_compute[186329]: 2025-12-05 06:18:10.902 186333 DEBUG oslo_concurrency.processutils [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:18:10 compute-0 nova_compute[186329]: 2025-12-05 06:18:10.943 186333 DEBUG oslo_concurrency.processutils [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:18:10 compute-0 nova_compute[186329]: 2025-12-05 06:18:10.944 186333 DEBUG oslo_concurrency.processutils [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b,backing_fmt=raw /var/lib/nova/instances/8937fe36-e7a1-49d6-947a-6627f5bbfffb/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:18:10 compute-0 nova_compute[186329]: 2025-12-05 06:18:10.964 186333 DEBUG oslo_concurrency.processutils [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b,backing_fmt=raw /var/lib/nova/instances/8937fe36-e7a1-49d6-947a-6627f5bbfffb/disk 1073741824" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:18:10 compute-0 nova_compute[186329]: 2025-12-05 06:18:10.965 186333 DEBUG oslo_concurrency.lockutils [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.067s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:18:10 compute-0 nova_compute[186329]: 2025-12-05 06:18:10.966 186333 DEBUG oslo_concurrency.processutils [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:18:11 compute-0 nova_compute[186329]: 2025-12-05 06:18:11.007 186333 DEBUG oslo_concurrency.processutils [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:18:11 compute-0 nova_compute[186329]: 2025-12-05 06:18:11.008 186333 DEBUG nova.virt.disk.api [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Checking if we can resize image /var/lib/nova/instances/8937fe36-e7a1-49d6-947a-6627f5bbfffb/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 05 06:18:11 compute-0 nova_compute[186329]: 2025-12-05 06:18:11.008 186333 DEBUG oslo_concurrency.processutils [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8937fe36-e7a1-49d6-947a-6627f5bbfffb/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:18:11 compute-0 nova_compute[186329]: 2025-12-05 06:18:11.050 186333 DEBUG oslo_concurrency.processutils [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8937fe36-e7a1-49d6-947a-6627f5bbfffb/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:18:11 compute-0 nova_compute[186329]: 2025-12-05 06:18:11.051 186333 DEBUG nova.virt.disk.api [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Cannot resize image /var/lib/nova/instances/8937fe36-e7a1-49d6-947a-6627f5bbfffb/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 05 06:18:11 compute-0 nova_compute[186329]: 2025-12-05 06:18:11.051 186333 DEBUG nova.objects.instance [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lazy-loading 'migration_context' on Instance uuid 8937fe36-e7a1-49d6-947a-6627f5bbfffb obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:18:11 compute-0 nova_compute[186329]: 2025-12-05 06:18:11.556 186333 DEBUG nova.objects.base [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Object Instance<8937fe36-e7a1-49d6-947a-6627f5bbfffb> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Dec 05 06:18:11 compute-0 nova_compute[186329]: 2025-12-05 06:18:11.557 186333 DEBUG oslo_concurrency.processutils [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/8937fe36-e7a1-49d6-947a-6627f5bbfffb/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:18:11 compute-0 nova_compute[186329]: 2025-12-05 06:18:11.575 186333 DEBUG oslo_concurrency.processutils [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/8937fe36-e7a1-49d6-947a-6627f5bbfffb/disk.config 497664" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:18:11 compute-0 nova_compute[186329]: 2025-12-05 06:18:11.576 186333 DEBUG nova.virt.libvirt.driver [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8937fe36-e7a1-49d6-947a-6627f5bbfffb] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Dec 05 06:18:11 compute-0 nova_compute[186329]: 2025-12-05 06:18:11.577 186333 DEBUG nova.virt.libvirt.vif [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-05T06:17:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1095450282',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1095450282',id=8,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T06:17:28Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='72c4ee5cc96a42b99210abaf8ae6fcc3',ramdisk_id='',reservation_id='r-0dzsfd99',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1144698866',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1144698866-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T06:17:28Z,user_data=None,user_id='6e966152abb6429a8d2dc82faf5464b5',uuid=8937fe36-e7a1-49d6-947a-6627f5bbfffb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c5f3cbbb-3662-4096-8009-80a6ae4ad778", "address": "fa:16:3e:23:3d:38", "network": {"id": "df5a1a6f-32f3-42ca-8c18-60ea4ce9d923", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1816665749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36f184a410c54894823168ed0f00b1ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc5f3cbbb-36", "ovs_interfaceid": "c5f3cbbb-3662-4096-8009-80a6ae4ad778", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 05 06:18:11 compute-0 nova_compute[186329]: 2025-12-05 06:18:11.577 186333 DEBUG nova.network.os_vif_util [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Converting VIF {"id": "c5f3cbbb-3662-4096-8009-80a6ae4ad778", "address": "fa:16:3e:23:3d:38", "network": {"id": "df5a1a6f-32f3-42ca-8c18-60ea4ce9d923", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1816665749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36f184a410c54894823168ed0f00b1ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc5f3cbbb-36", "ovs_interfaceid": "c5f3cbbb-3662-4096-8009-80a6ae4ad778", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:18:11 compute-0 nova_compute[186329]: 2025-12-05 06:18:11.578 186333 DEBUG nova.network.os_vif_util [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:23:3d:38,bridge_name='br-int',has_traffic_filtering=True,id=c5f3cbbb-3662-4096-8009-80a6ae4ad778,network=Network(df5a1a6f-32f3-42ca-8c18-60ea4ce9d923),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5f3cbbb-36') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:18:11 compute-0 nova_compute[186329]: 2025-12-05 06:18:11.579 186333 DEBUG os_vif [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:23:3d:38,bridge_name='br-int',has_traffic_filtering=True,id=c5f3cbbb-3662-4096-8009-80a6ae4ad778,network=Network(df5a1a6f-32f3-42ca-8c18-60ea4ce9d923),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5f3cbbb-36') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 05 06:18:11 compute-0 nova_compute[186329]: 2025-12-05 06:18:11.581 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:11 compute-0 nova_compute[186329]: 2025-12-05 06:18:11.581 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:18:11 compute-0 nova_compute[186329]: 2025-12-05 06:18:11.581 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:18:11 compute-0 nova_compute[186329]: 2025-12-05 06:18:11.582 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:11 compute-0 nova_compute[186329]: 2025-12-05 06:18:11.582 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '2dad0306-f670-51d7-808a-fa8addaefe0b', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:18:11 compute-0 nova_compute[186329]: 2025-12-05 06:18:11.583 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:11 compute-0 nova_compute[186329]: 2025-12-05 06:18:11.584 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:11 compute-0 nova_compute[186329]: 2025-12-05 06:18:11.586 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:11 compute-0 nova_compute[186329]: 2025-12-05 06:18:11.587 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc5f3cbbb-36, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:18:11 compute-0 nova_compute[186329]: 2025-12-05 06:18:11.587 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapc5f3cbbb-36, col_values=(('qos', UUID('003504fb-80c0-4474-a9fa-bbce4a024f1b')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:18:11 compute-0 nova_compute[186329]: 2025-12-05 06:18:11.587 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapc5f3cbbb-36, col_values=(('external_ids', {'iface-id': 'c5f3cbbb-3662-4096-8009-80a6ae4ad778', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:23:3d:38', 'vm-uuid': '8937fe36-e7a1-49d6-947a-6627f5bbfffb'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:18:11 compute-0 nova_compute[186329]: 2025-12-05 06:18:11.588 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:11 compute-0 NetworkManager[55434]: <info>  [1764915491.5892] manager: (tapc5f3cbbb-36): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Dec 05 06:18:11 compute-0 nova_compute[186329]: 2025-12-05 06:18:11.590 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:18:11 compute-0 nova_compute[186329]: 2025-12-05 06:18:11.593 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:11 compute-0 nova_compute[186329]: 2025-12-05 06:18:11.594 186333 INFO os_vif [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:23:3d:38,bridge_name='br-int',has_traffic_filtering=True,id=c5f3cbbb-3662-4096-8009-80a6ae4ad778,network=Network(df5a1a6f-32f3-42ca-8c18-60ea4ce9d923),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5f3cbbb-36')
Dec 05 06:18:11 compute-0 nova_compute[186329]: 2025-12-05 06:18:11.595 186333 DEBUG nova.virt.libvirt.driver [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Dec 05 06:18:11 compute-0 nova_compute[186329]: 2025-12-05 06:18:11.595 186333 DEBUG nova.compute.manager [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpznsgapyx',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='8937fe36-e7a1-49d6-947a-6627f5bbfffb',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Dec 05 06:18:11 compute-0 nova_compute[186329]: 2025-12-05 06:18:11.596 186333 WARNING neutronclient.v2_0.client [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:18:12 compute-0 nova_compute[186329]: 2025-12-05 06:18:12.488 186333 WARNING neutronclient.v2_0.client [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:18:13 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:13.495 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '46:84:d1', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'a6:99:88:db:d9:a2'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:18:13 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:13.496 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 05 06:18:13 compute-0 nova_compute[186329]: 2025-12-05 06:18:13.496 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:13 compute-0 nova_compute[186329]: 2025-12-05 06:18:13.685 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:14 compute-0 nova_compute[186329]: 2025-12-05 06:18:14.587 186333 DEBUG nova.network.neutron [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8937fe36-e7a1-49d6-947a-6627f5bbfffb] Port c5f3cbbb-3662-4096-8009-80a6ae4ad778 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Dec 05 06:18:14 compute-0 nova_compute[186329]: 2025-12-05 06:18:14.595 186333 DEBUG nova.compute.manager [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpznsgapyx',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='8937fe36-e7a1-49d6-947a-6627f5bbfffb',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Dec 05 06:18:16 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:16.497 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=89d40815-76f5-4f1d-9077-84d831b7d6c4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:18:16 compute-0 nova_compute[186329]: 2025-12-05 06:18:16.589 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:16 compute-0 systemd[1]: Starting libvirt proxy daemon...
Dec 05 06:18:16 compute-0 systemd[1]: Started libvirt proxy daemon.
Dec 05 06:18:16 compute-0 kernel: tapc5f3cbbb-36: entered promiscuous mode
Dec 05 06:18:16 compute-0 NetworkManager[55434]: <info>  [1764915496.9406] manager: (tapc5f3cbbb-36): new Tun device (/org/freedesktop/NetworkManager/Devices/42)
Dec 05 06:18:16 compute-0 nova_compute[186329]: 2025-12-05 06:18:16.943 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:16 compute-0 ovn_controller[95223]: 2025-12-05T06:18:16Z|00081|binding|INFO|Claiming lport c5f3cbbb-3662-4096-8009-80a6ae4ad778 for this additional chassis.
Dec 05 06:18:16 compute-0 ovn_controller[95223]: 2025-12-05T06:18:16Z|00082|binding|INFO|c5f3cbbb-3662-4096-8009-80a6ae4ad778: Claiming fa:16:3e:23:3d:38 10.100.0.13
Dec 05 06:18:16 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:16.948 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:23:3d:38 10.100.0.13'], port_security=['fa:16:3e:23:3d:38 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '8937fe36-e7a1-49d6-947a-6627f5bbfffb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '72c4ee5cc96a42b99210abaf8ae6fcc3', 'neutron:revision_number': '10', 'neutron:security_group_ids': '8deca48c-f664-4748-b23a-2c5f69c6b17a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1542876b-3afe-4c08-a982-954e6ce063b6, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=c5f3cbbb-3662-4096-8009-80a6ae4ad778) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:18:16 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:16.955 104041 INFO neutron.agent.ovn.metadata.agent [-] Port c5f3cbbb-3662-4096-8009-80a6ae4ad778 in datapath df5a1a6f-32f3-42ca-8c18-60ea4ce9d923 unbound from our chassis
Dec 05 06:18:16 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:16.955 104041 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network df5a1a6f-32f3-42ca-8c18-60ea4ce9d923
Dec 05 06:18:16 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:16.967 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[dae490c2-97db-4259-85cc-480e31c4bbca]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:18:16 compute-0 systemd-udevd[209613]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 06:18:16 compute-0 ovn_controller[95223]: 2025-12-05T06:18:16Z|00083|binding|INFO|Setting lport c5f3cbbb-3662-4096-8009-80a6ae4ad778 ovn-installed in OVS
Dec 05 06:18:16 compute-0 nova_compute[186329]: 2025-12-05 06:18:16.973 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:16 compute-0 nova_compute[186329]: 2025-12-05 06:18:16.974 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:16 compute-0 systemd-machined[152967]: New machine qemu-6-instance-00000008.
Dec 05 06:18:16 compute-0 nova_compute[186329]: 2025-12-05 06:18:16.981 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:16 compute-0 systemd[1]: Started Virtual Machine qemu-6-instance-00000008.
Dec 05 06:18:16 compute-0 NetworkManager[55434]: <info>  [1764915496.9854] device (tapc5f3cbbb-36): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 06:18:16 compute-0 NetworkManager[55434]: <info>  [1764915496.9859] device (tapc5f3cbbb-36): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 06:18:16 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:16.994 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[4adb7877-4fa1-4172-9562-6dcb1d177841]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:18:16 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:16.995 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[969712b3-d52b-4b11-a091-2e1830826b76]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:18:17 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:17.023 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[dc653294-b0e7-4d9a-b4d7-8e133d02057d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:18:17 compute-0 podman[209588]: 2025-12-05 06:18:17.031755557 +0000 UTC m=+0.096751340 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible)
Dec 05 06:18:17 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:17.043 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[90f49e25-84c5-41b0-8b85-eeac2aa21be6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdf5a1a6f-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:f8:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 334542, 'reachable_time': 40776, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209645, 'error': None, 'target': 'ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:18:17 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:17.054 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[74aa2f81-1397-42a0-9dec-ba51d745bcff]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapdf5a1a6f-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 334550, 'tstamp': 334550}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209654, 'error': None, 'target': 'ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapdf5a1a6f-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 334551, 'tstamp': 334551}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209654, 'error': None, 'target': 'ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:18:17 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:17.055 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdf5a1a6f-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:18:17 compute-0 nova_compute[186329]: 2025-12-05 06:18:17.056 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:17 compute-0 nova_compute[186329]: 2025-12-05 06:18:17.057 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:17 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:17.057 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdf5a1a6f-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:18:17 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:17.057 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:18:17 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:17.057 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdf5a1a6f-30, col_values=(('external_ids', {'iface-id': '11b2e7a6-c4ec-4f31-8535-807d9ce71179'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:18:17 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:17.058 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:18:17 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:17.059 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[af926e01-63cd-4925-86c1-1e1d51c70c8b]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/df5a1a6f-32f3-42ca-8c18-60ea4ce9d923.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID df5a1a6f-32f3-42ca-8c18-60ea4ce9d923\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:18:17 compute-0 podman[209589]: 2025-12-05 06:18:17.063549514 +0000 UTC m=+0.126368084 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 06:18:18 compute-0 nova_compute[186329]: 2025-12-05 06:18:18.686 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:19 compute-0 ovn_controller[95223]: 2025-12-05T06:18:19Z|00084|binding|INFO|Claiming lport c5f3cbbb-3662-4096-8009-80a6ae4ad778 for this chassis.
Dec 05 06:18:19 compute-0 ovn_controller[95223]: 2025-12-05T06:18:19Z|00085|binding|INFO|c5f3cbbb-3662-4096-8009-80a6ae4ad778: Claiming fa:16:3e:23:3d:38 10.100.0.13
Dec 05 06:18:19 compute-0 ovn_controller[95223]: 2025-12-05T06:18:19Z|00086|binding|INFO|Setting lport c5f3cbbb-3662-4096-8009-80a6ae4ad778 up in Southbound
Dec 05 06:18:21 compute-0 nova_compute[186329]: 2025-12-05 06:18:21.592 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:21 compute-0 nova_compute[186329]: 2025-12-05 06:18:21.704 186333 INFO nova.compute.manager [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8937fe36-e7a1-49d6-947a-6627f5bbfffb] Post operation of migration started
Dec 05 06:18:21 compute-0 nova_compute[186329]: 2025-12-05 06:18:21.705 186333 WARNING neutronclient.v2_0.client [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:18:22 compute-0 nova_compute[186329]: 2025-12-05 06:18:22.488 186333 WARNING neutronclient.v2_0.client [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:18:22 compute-0 nova_compute[186329]: 2025-12-05 06:18:22.488 186333 WARNING neutronclient.v2_0.client [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:18:22 compute-0 nova_compute[186329]: 2025-12-05 06:18:22.556 186333 DEBUG oslo_concurrency.lockutils [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "refresh_cache-8937fe36-e7a1-49d6-947a-6627f5bbfffb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:18:22 compute-0 nova_compute[186329]: 2025-12-05 06:18:22.556 186333 DEBUG oslo_concurrency.lockutils [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquired lock "refresh_cache-8937fe36-e7a1-49d6-947a-6627f5bbfffb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:18:22 compute-0 nova_compute[186329]: 2025-12-05 06:18:22.556 186333 DEBUG nova.network.neutron [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8937fe36-e7a1-49d6-947a-6627f5bbfffb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 05 06:18:23 compute-0 nova_compute[186329]: 2025-12-05 06:18:23.060 186333 WARNING neutronclient.v2_0.client [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:18:23 compute-0 nova_compute[186329]: 2025-12-05 06:18:23.688 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:23 compute-0 nova_compute[186329]: 2025-12-05 06:18:23.725 186333 WARNING neutronclient.v2_0.client [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:18:23 compute-0 nova_compute[186329]: 2025-12-05 06:18:23.865 186333 DEBUG nova.network.neutron [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8937fe36-e7a1-49d6-947a-6627f5bbfffb] Updating instance_info_cache with network_info: [{"id": "c5f3cbbb-3662-4096-8009-80a6ae4ad778", "address": "fa:16:3e:23:3d:38", "network": {"id": "df5a1a6f-32f3-42ca-8c18-60ea4ce9d923", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1816665749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36f184a410c54894823168ed0f00b1ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5f3cbbb-36", "ovs_interfaceid": "c5f3cbbb-3662-4096-8009-80a6ae4ad778", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:18:24 compute-0 nova_compute[186329]: 2025-12-05 06:18:24.369 186333 DEBUG oslo_concurrency.lockutils [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Releasing lock "refresh_cache-8937fe36-e7a1-49d6-947a-6627f5bbfffb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:18:24 compute-0 nova_compute[186329]: 2025-12-05 06:18:24.880 186333 DEBUG oslo_concurrency.lockutils [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:18:24 compute-0 nova_compute[186329]: 2025-12-05 06:18:24.880 186333 DEBUG oslo_concurrency.lockutils [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:18:24 compute-0 nova_compute[186329]: 2025-12-05 06:18:24.880 186333 DEBUG oslo_concurrency.lockutils [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:18:24 compute-0 nova_compute[186329]: 2025-12-05 06:18:24.884 186333 INFO nova.virt.libvirt.driver [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8937fe36-e7a1-49d6-947a-6627f5bbfffb] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Dec 05 06:18:24 compute-0 virtqemud[186605]: Domain id=6 name='instance-00000008' uuid=8937fe36-e7a1-49d6-947a-6627f5bbfffb is tainted: custom-monitor
Dec 05 06:18:25 compute-0 podman[209679]: 2025-12-05 06:18:25.470687769 +0000 UTC m=+0.049599788 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Dec 05 06:18:25 compute-0 podman[209677]: 2025-12-05 06:18:25.470739355 +0000 UTC m=+0.051439936 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_id=ovn_metadata_agent, 
container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec 05 06:18:25 compute-0 podman[209678]: 2025-12-05 06:18:25.47202376 +0000 UTC m=+0.050676020 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., managed_by=edpm_ansible, 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.tags=minimal rhel9, name=ubi9-minimal, release=1755695350, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-type=git)
Dec 05 06:18:25 compute-0 nova_compute[186329]: 2025-12-05 06:18:25.888 186333 INFO nova.virt.libvirt.driver [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8937fe36-e7a1-49d6-947a-6627f5bbfffb] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Dec 05 06:18:26 compute-0 nova_compute[186329]: 2025-12-05 06:18:26.596 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:26 compute-0 nova_compute[186329]: 2025-12-05 06:18:26.893 186333 INFO nova.virt.libvirt.driver [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8937fe36-e7a1-49d6-947a-6627f5bbfffb] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Dec 05 06:18:26 compute-0 nova_compute[186329]: 2025-12-05 06:18:26.897 186333 DEBUG nova.compute.manager [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8937fe36-e7a1-49d6-947a-6627f5bbfffb] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 05 06:18:27 compute-0 nova_compute[186329]: 2025-12-05 06:18:27.404 186333 DEBUG nova.objects.instance [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8937fe36-e7a1-49d6-947a-6627f5bbfffb] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Dec 05 06:18:28 compute-0 nova_compute[186329]: 2025-12-05 06:18:28.416 186333 WARNING neutronclient.v2_0.client [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:18:28 compute-0 nova_compute[186329]: 2025-12-05 06:18:28.689 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:29 compute-0 nova_compute[186329]: 2025-12-05 06:18:29.487 186333 WARNING neutronclient.v2_0.client [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:18:29 compute-0 nova_compute[186329]: 2025-12-05 06:18:29.487 186333 WARNING neutronclient.v2_0.client [None req-09616dee-45f4-498d-9532-b203435d1900 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:18:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:29.501 104041 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:18:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:29.501 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:18:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:29.502 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:18:29 compute-0 podman[196599]: time="2025-12-05T06:18:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:18:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:18:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18590 "" "Go-http-client/1.1"
Dec 05 06:18:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:18:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3049 "" "Go-http-client/1.1"
Dec 05 06:18:31 compute-0 openstack_network_exporter[198686]: ERROR   06:18:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:18:31 compute-0 openstack_network_exporter[198686]: ERROR   06:18:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:18:31 compute-0 openstack_network_exporter[198686]: ERROR   06:18:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:18:31 compute-0 openstack_network_exporter[198686]: ERROR   06:18:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:18:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:18:31 compute-0 openstack_network_exporter[198686]: ERROR   06:18:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:18:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:18:31 compute-0 nova_compute[186329]: 2025-12-05 06:18:31.596 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:33 compute-0 nova_compute[186329]: 2025-12-05 06:18:33.690 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:36 compute-0 nova_compute[186329]: 2025-12-05 06:18:36.598 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:38 compute-0 nova_compute[186329]: 2025-12-05 06:18:38.691 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:39 compute-0 nova_compute[186329]: 2025-12-05 06:18:39.756 186333 DEBUG oslo_concurrency.lockutils [None req-fb07f896-d652-4191-bbe7-bf52fbfb4a65 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Acquiring lock "34c83f7e-4b49-4a20-b482-394fe5b63c68" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:18:39 compute-0 nova_compute[186329]: 2025-12-05 06:18:39.756 186333 DEBUG oslo_concurrency.lockutils [None req-fb07f896-d652-4191-bbe7-bf52fbfb4a65 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "34c83f7e-4b49-4a20-b482-394fe5b63c68" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:18:39 compute-0 nova_compute[186329]: 2025-12-05 06:18:39.757 186333 DEBUG oslo_concurrency.lockutils [None req-fb07f896-d652-4191-bbe7-bf52fbfb4a65 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Acquiring lock "34c83f7e-4b49-4a20-b482-394fe5b63c68-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:18:39 compute-0 nova_compute[186329]: 2025-12-05 06:18:39.757 186333 DEBUG oslo_concurrency.lockutils [None req-fb07f896-d652-4191-bbe7-bf52fbfb4a65 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "34c83f7e-4b49-4a20-b482-394fe5b63c68-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:18:39 compute-0 nova_compute[186329]: 2025-12-05 06:18:39.757 186333 DEBUG oslo_concurrency.lockutils [None req-fb07f896-d652-4191-bbe7-bf52fbfb4a65 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "34c83f7e-4b49-4a20-b482-394fe5b63c68-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:18:39 compute-0 nova_compute[186329]: 2025-12-05 06:18:39.763 186333 INFO nova.compute.manager [None req-fb07f896-d652-4191-bbe7-bf52fbfb4a65 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Terminating instance
Dec 05 06:18:40 compute-0 nova_compute[186329]: 2025-12-05 06:18:40.271 186333 DEBUG nova.compute.manager [None req-fb07f896-d652-4191-bbe7-bf52fbfb4a65 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 05 06:18:40 compute-0 kernel: tapc00622c0-b6 (unregistering): left promiscuous mode
Dec 05 06:18:40 compute-0 NetworkManager[55434]: <info>  [1764915520.3000] device (tapc00622c0-b6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 06:18:40 compute-0 nova_compute[186329]: 2025-12-05 06:18:40.306 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:40 compute-0 nova_compute[186329]: 2025-12-05 06:18:40.310 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:40 compute-0 ovn_controller[95223]: 2025-12-05T06:18:40Z|00087|binding|INFO|Releasing lport c00622c0-b6d3-4225-8e33-44b59cb8a42a from this chassis (sb_readonly=0)
Dec 05 06:18:40 compute-0 ovn_controller[95223]: 2025-12-05T06:18:40Z|00088|binding|INFO|Setting lport c00622c0-b6d3-4225-8e33-44b59cb8a42a down in Southbound
Dec 05 06:18:40 compute-0 ovn_controller[95223]: 2025-12-05T06:18:40Z|00089|binding|INFO|Removing iface tapc00622c0-b6 ovn-installed in OVS
Dec 05 06:18:40 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:40.319 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d2:4d:71 10.100.0.4'], port_security=['fa:16:3e:d2:4d:71 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '34c83f7e-4b49-4a20-b482-394fe5b63c68', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '72c4ee5cc96a42b99210abaf8ae6fcc3', 'neutron:revision_number': '5', 'neutron:security_group_ids': '8deca48c-f664-4748-b23a-2c5f69c6b17a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1542876b-3afe-4c08-a982-954e6ce063b6, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], logical_port=c00622c0-b6d3-4225-8e33-44b59cb8a42a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:18:40 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:40.319 104041 INFO neutron.agent.ovn.metadata.agent [-] Port c00622c0-b6d3-4225-8e33-44b59cb8a42a in datapath df5a1a6f-32f3-42ca-8c18-60ea4ce9d923 unbound from our chassis
Dec 05 06:18:40 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:40.320 104041 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network df5a1a6f-32f3-42ca-8c18-60ea4ce9d923
Dec 05 06:18:40 compute-0 nova_compute[186329]: 2025-12-05 06:18:40.321 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:40 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:40.333 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[a19606a3-2506-4da5-ae18-01b008b8f315]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:18:40 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000009.scope: Deactivated successfully.
Dec 05 06:18:40 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000009.scope: Consumed 11.780s CPU time.
Dec 05 06:18:40 compute-0 systemd-machined[152967]: Machine qemu-5-instance-00000009 terminated.
Dec 05 06:18:40 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:40.354 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[907bed3a-1186-47c9-9ed0-2deba1af1616]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:18:40 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:40.356 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[d047bb94-505e-4d96-8f35-81c0fcd288d5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:18:40 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:40.373 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[baa24951-d57d-4a3c-b5b8-5e2276328373]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:18:40 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:40.386 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[577469b5-b58a-4405-aaa4-1cbcc20dd988]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdf5a1a6f-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:f8:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 27, 'tx_packets': 7, 'rx_bytes': 1414, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 27, 'tx_packets': 7, 'rx_bytes': 1414, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 334542, 'reachable_time': 40776, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209740, 'error': None, 'target': 'ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:18:40 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:40.397 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[0762ff52-58ce-4f3e-a12c-58c31f77f810]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapdf5a1a6f-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 334550, 'tstamp': 334550}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209741, 'error': None, 'target': 'ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapdf5a1a6f-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 334551, 'tstamp': 334551}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209741, 'error': None, 'target': 'ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:18:40 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:40.398 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdf5a1a6f-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:18:40 compute-0 nova_compute[186329]: 2025-12-05 06:18:40.399 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:40 compute-0 nova_compute[186329]: 2025-12-05 06:18:40.402 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:40 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:40.403 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdf5a1a6f-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:18:40 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:40.403 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:18:40 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:40.403 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdf5a1a6f-30, col_values=(('external_ids', {'iface-id': '11b2e7a6-c4ec-4f31-8535-807d9ce71179'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:18:40 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:40.403 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:18:40 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:40.404 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[8351b06f-47b7-4fe2-bfbd-2c16fd821295]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/df5a1a6f-32f3-42ca-8c18-60ea4ce9d923.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID df5a1a6f-32f3-42ca-8c18-60ea4ce9d923\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:18:40 compute-0 nova_compute[186329]: 2025-12-05 06:18:40.436 186333 DEBUG nova.compute.manager [req-aff3533f-4424-4295-93bd-ddc24ef6e457 req-577829f0-667b-40a5-838d-f267aa29d71e fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Received event network-vif-unplugged-c00622c0-b6d3-4225-8e33-44b59cb8a42a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:18:40 compute-0 nova_compute[186329]: 2025-12-05 06:18:40.436 186333 DEBUG oslo_concurrency.lockutils [req-aff3533f-4424-4295-93bd-ddc24ef6e457 req-577829f0-667b-40a5-838d-f267aa29d71e fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "34c83f7e-4b49-4a20-b482-394fe5b63c68-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:18:40 compute-0 nova_compute[186329]: 2025-12-05 06:18:40.436 186333 DEBUG oslo_concurrency.lockutils [req-aff3533f-4424-4295-93bd-ddc24ef6e457 req-577829f0-667b-40a5-838d-f267aa29d71e fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "34c83f7e-4b49-4a20-b482-394fe5b63c68-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:18:40 compute-0 nova_compute[186329]: 2025-12-05 06:18:40.437 186333 DEBUG oslo_concurrency.lockutils [req-aff3533f-4424-4295-93bd-ddc24ef6e457 req-577829f0-667b-40a5-838d-f267aa29d71e fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "34c83f7e-4b49-4a20-b482-394fe5b63c68-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:18:40 compute-0 nova_compute[186329]: 2025-12-05 06:18:40.437 186333 DEBUG nova.compute.manager [req-aff3533f-4424-4295-93bd-ddc24ef6e457 req-577829f0-667b-40a5-838d-f267aa29d71e fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] No waiting events found dispatching network-vif-unplugged-c00622c0-b6d3-4225-8e33-44b59cb8a42a pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:18:40 compute-0 nova_compute[186329]: 2025-12-05 06:18:40.437 186333 DEBUG nova.compute.manager [req-aff3533f-4424-4295-93bd-ddc24ef6e457 req-577829f0-667b-40a5-838d-f267aa29d71e fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Received event network-vif-unplugged-c00622c0-b6d3-4225-8e33-44b59cb8a42a for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 05 06:18:40 compute-0 nova_compute[186329]: 2025-12-05 06:18:40.483 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:40 compute-0 nova_compute[186329]: 2025-12-05 06:18:40.486 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:40 compute-0 nova_compute[186329]: 2025-12-05 06:18:40.504 186333 INFO nova.virt.libvirt.driver [-] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Instance destroyed successfully.
Dec 05 06:18:40 compute-0 nova_compute[186329]: 2025-12-05 06:18:40.505 186333 DEBUG nova.objects.instance [None req-fb07f896-d652-4191-bbe7-bf52fbfb4a65 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lazy-loading 'resources' on Instance uuid 34c83f7e-4b49-4a20-b482-394fe5b63c68 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:18:41 compute-0 nova_compute[186329]: 2025-12-05 06:18:41.008 186333 DEBUG nova.virt.libvirt.vif [None req-fb07f896-d652-4191-bbe7-bf52fbfb4a65 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-12-05T06:17:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-194938820',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-194938820',id=9,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T06:17:48Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='72c4ee5cc96a42b99210abaf8ae6fcc3',ramdisk_id='',reservation_id='r-ly8dat25',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1144698866',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1144698866-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T06:17:48Z,user_data=None,user_id='6e966152abb6429a8d2dc82faf5464b5',uuid=34c83f7e-4b49-4a20-b482-394fe5b63c68,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c00622c0-b6d3-4225-8e33-44b59cb8a42a", "address": "fa:16:3e:d2:4d:71", "network": {"id": "df5a1a6f-32f3-42ca-8c18-60ea4ce9d923", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1816665749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36f184a410c54894823168ed0f00b1ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00622c0-b6", "ovs_interfaceid": "c00622c0-b6d3-4225-8e33-44b59cb8a42a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 05 06:18:41 compute-0 nova_compute[186329]: 2025-12-05 06:18:41.009 186333 DEBUG nova.network.os_vif_util [None req-fb07f896-d652-4191-bbe7-bf52fbfb4a65 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Converting VIF {"id": "c00622c0-b6d3-4225-8e33-44b59cb8a42a", "address": "fa:16:3e:d2:4d:71", "network": {"id": "df5a1a6f-32f3-42ca-8c18-60ea4ce9d923", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1816665749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36f184a410c54894823168ed0f00b1ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00622c0-b6", "ovs_interfaceid": "c00622c0-b6d3-4225-8e33-44b59cb8a42a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:18:41 compute-0 nova_compute[186329]: 2025-12-05 06:18:41.009 186333 DEBUG nova.network.os_vif_util [None req-fb07f896-d652-4191-bbe7-bf52fbfb4a65 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d2:4d:71,bridge_name='br-int',has_traffic_filtering=True,id=c00622c0-b6d3-4225-8e33-44b59cb8a42a,network=Network(df5a1a6f-32f3-42ca-8c18-60ea4ce9d923),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc00622c0-b6') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:18:41 compute-0 nova_compute[186329]: 2025-12-05 06:18:41.009 186333 DEBUG os_vif [None req-fb07f896-d652-4191-bbe7-bf52fbfb4a65 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d2:4d:71,bridge_name='br-int',has_traffic_filtering=True,id=c00622c0-b6d3-4225-8e33-44b59cb8a42a,network=Network(df5a1a6f-32f3-42ca-8c18-60ea4ce9d923),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc00622c0-b6') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 05 06:18:41 compute-0 nova_compute[186329]: 2025-12-05 06:18:41.010 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:41 compute-0 nova_compute[186329]: 2025-12-05 06:18:41.011 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc00622c0-b6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:18:41 compute-0 nova_compute[186329]: 2025-12-05 06:18:41.012 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:41 compute-0 nova_compute[186329]: 2025-12-05 06:18:41.013 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:41 compute-0 nova_compute[186329]: 2025-12-05 06:18:41.013 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=530f62a2-9864-4522-8781-8870e95db287) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:18:41 compute-0 nova_compute[186329]: 2025-12-05 06:18:41.014 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:41 compute-0 nova_compute[186329]: 2025-12-05 06:18:41.016 186333 INFO os_vif [None req-fb07f896-d652-4191-bbe7-bf52fbfb4a65 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d2:4d:71,bridge_name='br-int',has_traffic_filtering=True,id=c00622c0-b6d3-4225-8e33-44b59cb8a42a,network=Network(df5a1a6f-32f3-42ca-8c18-60ea4ce9d923),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc00622c0-b6')
Dec 05 06:18:41 compute-0 nova_compute[186329]: 2025-12-05 06:18:41.016 186333 INFO nova.virt.libvirt.driver [None req-fb07f896-d652-4191-bbe7-bf52fbfb4a65 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Deleting instance files /var/lib/nova/instances/34c83f7e-4b49-4a20-b482-394fe5b63c68_del
Dec 05 06:18:41 compute-0 nova_compute[186329]: 2025-12-05 06:18:41.016 186333 INFO nova.virt.libvirt.driver [None req-fb07f896-d652-4191-bbe7-bf52fbfb4a65 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Deletion of /var/lib/nova/instances/34c83f7e-4b49-4a20-b482-394fe5b63c68_del complete
Dec 05 06:18:41 compute-0 nova_compute[186329]: 2025-12-05 06:18:41.524 186333 INFO nova.compute.manager [None req-fb07f896-d652-4191-bbe7-bf52fbfb4a65 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Took 1.25 seconds to destroy the instance on the hypervisor.
Dec 05 06:18:41 compute-0 nova_compute[186329]: 2025-12-05 06:18:41.525 186333 DEBUG oslo.service.backend._eventlet.loopingcall [None req-fb07f896-d652-4191-bbe7-bf52fbfb4a65 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 05 06:18:41 compute-0 nova_compute[186329]: 2025-12-05 06:18:41.525 186333 DEBUG nova.compute.manager [-] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 05 06:18:41 compute-0 nova_compute[186329]: 2025-12-05 06:18:41.525 186333 DEBUG nova.network.neutron [-] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 05 06:18:41 compute-0 nova_compute[186329]: 2025-12-05 06:18:41.525 186333 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:18:42 compute-0 nova_compute[186329]: 2025-12-05 06:18:42.489 186333 DEBUG nova.compute.manager [req-b3e03e53-cf02-42f7-9280-8b6b5c70fd01 req-e0c21033-42df-4479-b7d4-1dec7bfe781d fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Received event network-vif-unplugged-c00622c0-b6d3-4225-8e33-44b59cb8a42a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:18:42 compute-0 nova_compute[186329]: 2025-12-05 06:18:42.489 186333 DEBUG oslo_concurrency.lockutils [req-b3e03e53-cf02-42f7-9280-8b6b5c70fd01 req-e0c21033-42df-4479-b7d4-1dec7bfe781d fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "34c83f7e-4b49-4a20-b482-394fe5b63c68-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:18:42 compute-0 nova_compute[186329]: 2025-12-05 06:18:42.489 186333 DEBUG oslo_concurrency.lockutils [req-b3e03e53-cf02-42f7-9280-8b6b5c70fd01 req-e0c21033-42df-4479-b7d4-1dec7bfe781d fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "34c83f7e-4b49-4a20-b482-394fe5b63c68-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:18:42 compute-0 nova_compute[186329]: 2025-12-05 06:18:42.489 186333 DEBUG oslo_concurrency.lockutils [req-b3e03e53-cf02-42f7-9280-8b6b5c70fd01 req-e0c21033-42df-4479-b7d4-1dec7bfe781d fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "34c83f7e-4b49-4a20-b482-394fe5b63c68-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:18:42 compute-0 nova_compute[186329]: 2025-12-05 06:18:42.490 186333 DEBUG nova.compute.manager [req-b3e03e53-cf02-42f7-9280-8b6b5c70fd01 req-e0c21033-42df-4479-b7d4-1dec7bfe781d fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] No waiting events found dispatching network-vif-unplugged-c00622c0-b6d3-4225-8e33-44b59cb8a42a pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:18:42 compute-0 nova_compute[186329]: 2025-12-05 06:18:42.490 186333 DEBUG nova.compute.manager [req-b3e03e53-cf02-42f7-9280-8b6b5c70fd01 req-e0c21033-42df-4479-b7d4-1dec7bfe781d fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Received event network-vif-unplugged-c00622c0-b6d3-4225-8e33-44b59cb8a42a for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 05 06:18:42 compute-0 nova_compute[186329]: 2025-12-05 06:18:42.499 186333 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:18:42 compute-0 nova_compute[186329]: 2025-12-05 06:18:42.705 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:18:43 compute-0 nova_compute[186329]: 2025-12-05 06:18:43.692 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:44 compute-0 nova_compute[186329]: 2025-12-05 06:18:44.569 186333 DEBUG nova.compute.manager [req-5177a8fb-c6ac-4bc3-a92d-9658d5a50f75 req-f7d32641-bee4-4b15-9cae-69db6e4e99bd fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Received event network-vif-deleted-c00622c0-b6d3-4225-8e33-44b59cb8a42a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:18:44 compute-0 nova_compute[186329]: 2025-12-05 06:18:44.569 186333 INFO nova.compute.manager [req-5177a8fb-c6ac-4bc3-a92d-9658d5a50f75 req-f7d32641-bee4-4b15-9cae-69db6e4e99bd fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Neutron deleted interface c00622c0-b6d3-4225-8e33-44b59cb8a42a; detaching it from the instance and deleting it from the info cache
Dec 05 06:18:44 compute-0 nova_compute[186329]: 2025-12-05 06:18:44.569 186333 DEBUG nova.network.neutron [req-5177a8fb-c6ac-4bc3-a92d-9658d5a50f75 req-f7d32641-bee4-4b15-9cae-69db6e4e99bd fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:18:45 compute-0 nova_compute[186329]: 2025-12-05 06:18:45.013 186333 DEBUG nova.network.neutron [-] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:18:45 compute-0 nova_compute[186329]: 2025-12-05 06:18:45.073 186333 DEBUG nova.compute.manager [req-5177a8fb-c6ac-4bc3-a92d-9658d5a50f75 req-f7d32641-bee4-4b15-9cae-69db6e4e99bd fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Detach interface failed, port_id=c00622c0-b6d3-4225-8e33-44b59cb8a42a, reason: Instance 34c83f7e-4b49-4a20-b482-394fe5b63c68 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Dec 05 06:18:45 compute-0 nova_compute[186329]: 2025-12-05 06:18:45.517 186333 INFO nova.compute.manager [-] [instance: 34c83f7e-4b49-4a20-b482-394fe5b63c68] Took 3.99 seconds to deallocate network for instance.
Dec 05 06:18:46 compute-0 nova_compute[186329]: 2025-12-05 06:18:46.014 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:46 compute-0 nova_compute[186329]: 2025-12-05 06:18:46.033 186333 DEBUG oslo_concurrency.lockutils [None req-fb07f896-d652-4191-bbe7-bf52fbfb4a65 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:18:46 compute-0 nova_compute[186329]: 2025-12-05 06:18:46.033 186333 DEBUG oslo_concurrency.lockutils [None req-fb07f896-d652-4191-bbe7-bf52fbfb4a65 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:18:46 compute-0 nova_compute[186329]: 2025-12-05 06:18:46.090 186333 DEBUG nova.compute.provider_tree [None req-fb07f896-d652-4191-bbe7-bf52fbfb4a65 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:18:46 compute-0 nova_compute[186329]: 2025-12-05 06:18:46.596 186333 DEBUG nova.scheduler.client.report [None req-fb07f896-d652-4191-bbe7-bf52fbfb4a65 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:18:46 compute-0 nova_compute[186329]: 2025-12-05 06:18:46.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:18:47 compute-0 nova_compute[186329]: 2025-12-05 06:18:47.103 186333 DEBUG oslo_concurrency.lockutils [None req-fb07f896-d652-4191-bbe7-bf52fbfb4a65 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.070s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:18:47 compute-0 nova_compute[186329]: 2025-12-05 06:18:47.118 186333 INFO nova.scheduler.client.report [None req-fb07f896-d652-4191-bbe7-bf52fbfb4a65 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Deleted allocations for instance 34c83f7e-4b49-4a20-b482-394fe5b63c68
Dec 05 06:18:47 compute-0 podman[209759]: 2025-12-05 06:18:47.481468195 +0000 UTC m=+0.065570064 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.4)
Dec 05 06:18:47 compute-0 podman[209760]: 2025-12-05 06:18:47.486522337 +0000 UTC m=+0.070153481 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 06:18:47 compute-0 nova_compute[186329]: 2025-12-05 06:18:47.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:18:47 compute-0 nova_compute[186329]: 2025-12-05 06:18:47.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:18:48 compute-0 nova_compute[186329]: 2025-12-05 06:18:48.137 186333 DEBUG oslo_concurrency.lockutils [None req-fb07f896-d652-4191-bbe7-bf52fbfb4a65 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "34c83f7e-4b49-4a20-b482-394fe5b63c68" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.381s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:18:48 compute-0 nova_compute[186329]: 2025-12-05 06:18:48.217 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:18:48 compute-0 nova_compute[186329]: 2025-12-05 06:18:48.217 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:18:48 compute-0 nova_compute[186329]: 2025-12-05 06:18:48.217 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:18:48 compute-0 nova_compute[186329]: 2025-12-05 06:18:48.217 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 05 06:18:48 compute-0 nova_compute[186329]: 2025-12-05 06:18:48.693 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:49 compute-0 nova_compute[186329]: 2025-12-05 06:18:49.244 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8937fe36-e7a1-49d6-947a-6627f5bbfffb/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:18:49 compute-0 nova_compute[186329]: 2025-12-05 06:18:49.287 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8937fe36-e7a1-49d6-947a-6627f5bbfffb/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:18:49 compute-0 nova_compute[186329]: 2025-12-05 06:18:49.287 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8937fe36-e7a1-49d6-947a-6627f5bbfffb/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:18:49 compute-0 nova_compute[186329]: 2025-12-05 06:18:49.329 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8937fe36-e7a1-49d6-947a-6627f5bbfffb/disk --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:18:49 compute-0 nova_compute[186329]: 2025-12-05 06:18:49.515 186333 WARNING nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:18:49 compute-0 nova_compute[186329]: 2025-12-05 06:18:49.516 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:18:49 compute-0 nova_compute[186329]: 2025-12-05 06:18:49.529 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.014s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:18:49 compute-0 nova_compute[186329]: 2025-12-05 06:18:49.530 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5685MB free_disk=73.14015197753906GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 05 06:18:49 compute-0 nova_compute[186329]: 2025-12-05 06:18:49.530 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:18:49 compute-0 nova_compute[186329]: 2025-12-05 06:18:49.531 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:18:50 compute-0 nova_compute[186329]: 2025-12-05 06:18:50.167 186333 DEBUG oslo_concurrency.lockutils [None req-9c27ccbb-08a5-4cc9-9e39-0db7618cf4db 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Acquiring lock "8937fe36-e7a1-49d6-947a-6627f5bbfffb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:18:50 compute-0 nova_compute[186329]: 2025-12-05 06:18:50.168 186333 DEBUG oslo_concurrency.lockutils [None req-9c27ccbb-08a5-4cc9-9e39-0db7618cf4db 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "8937fe36-e7a1-49d6-947a-6627f5bbfffb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:18:50 compute-0 nova_compute[186329]: 2025-12-05 06:18:50.168 186333 DEBUG oslo_concurrency.lockutils [None req-9c27ccbb-08a5-4cc9-9e39-0db7618cf4db 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Acquiring lock "8937fe36-e7a1-49d6-947a-6627f5bbfffb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:18:50 compute-0 nova_compute[186329]: 2025-12-05 06:18:50.168 186333 DEBUG oslo_concurrency.lockutils [None req-9c27ccbb-08a5-4cc9-9e39-0db7618cf4db 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "8937fe36-e7a1-49d6-947a-6627f5bbfffb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:18:50 compute-0 nova_compute[186329]: 2025-12-05 06:18:50.169 186333 DEBUG oslo_concurrency.lockutils [None req-9c27ccbb-08a5-4cc9-9e39-0db7618cf4db 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "8937fe36-e7a1-49d6-947a-6627f5bbfffb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:18:50 compute-0 nova_compute[186329]: 2025-12-05 06:18:50.176 186333 INFO nova.compute.manager [None req-9c27ccbb-08a5-4cc9-9e39-0db7618cf4db 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 8937fe36-e7a1-49d6-947a-6627f5bbfffb] Terminating instance
Dec 05 06:18:50 compute-0 nova_compute[186329]: 2025-12-05 06:18:50.684 186333 DEBUG nova.compute.manager [None req-9c27ccbb-08a5-4cc9-9e39-0db7618cf4db 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 8937fe36-e7a1-49d6-947a-6627f5bbfffb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 05 06:18:50 compute-0 kernel: tapc5f3cbbb-36 (unregistering): left promiscuous mode
Dec 05 06:18:50 compute-0 NetworkManager[55434]: <info>  [1764915530.7102] device (tapc5f3cbbb-36): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 06:18:50 compute-0 ovn_controller[95223]: 2025-12-05T06:18:50Z|00090|binding|INFO|Releasing lport c5f3cbbb-3662-4096-8009-80a6ae4ad778 from this chassis (sb_readonly=0)
Dec 05 06:18:50 compute-0 ovn_controller[95223]: 2025-12-05T06:18:50Z|00091|binding|INFO|Setting lport c5f3cbbb-3662-4096-8009-80a6ae4ad778 down in Southbound
Dec 05 06:18:50 compute-0 ovn_controller[95223]: 2025-12-05T06:18:50Z|00092|binding|INFO|Removing iface tapc5f3cbbb-36 ovn-installed in OVS
Dec 05 06:18:50 compute-0 nova_compute[186329]: 2025-12-05 06:18:50.716 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:50 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:50.724 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:23:3d:38 10.100.0.13'], port_security=['fa:16:3e:23:3d:38 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '8937fe36-e7a1-49d6-947a-6627f5bbfffb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '72c4ee5cc96a42b99210abaf8ae6fcc3', 'neutron:revision_number': '14', 'neutron:security_group_ids': '8deca48c-f664-4748-b23a-2c5f69c6b17a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1542876b-3afe-4c08-a982-954e6ce063b6, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], logical_port=c5f3cbbb-3662-4096-8009-80a6ae4ad778) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:18:50 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:50.725 104041 INFO neutron.agent.ovn.metadata.agent [-] Port c5f3cbbb-3662-4096-8009-80a6ae4ad778 in datapath df5a1a6f-32f3-42ca-8c18-60ea4ce9d923 unbound from our chassis
Dec 05 06:18:50 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:50.726 104041 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network df5a1a6f-32f3-42ca-8c18-60ea4ce9d923, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 05 06:18:50 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:50.728 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[18fd2cb7-f9b1-4e3d-b79b-c0446a7573c9]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:18:50 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:50.728 104041 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923 namespace which is not needed anymore
Dec 05 06:18:50 compute-0 nova_compute[186329]: 2025-12-05 06:18:50.731 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:50 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000008.scope: Deactivated successfully.
Dec 05 06:18:50 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000008.scope: Consumed 2.267s CPU time.
Dec 05 06:18:50 compute-0 systemd-machined[152967]: Machine qemu-6-instance-00000008 terminated.
Dec 05 06:18:50 compute-0 neutron-haproxy-ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923[209465]: [NOTICE]   (209469) : haproxy version is 3.0.5-8e879a5
Dec 05 06:18:50 compute-0 neutron-haproxy-ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923[209465]: [NOTICE]   (209469) : path to executable is /usr/sbin/haproxy
Dec 05 06:18:50 compute-0 neutron-haproxy-ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923[209465]: [WARNING]  (209469) : Exiting Master process...
Dec 05 06:18:50 compute-0 podman[209834]: 2025-12-05 06:18:50.813573886 +0000 UTC m=+0.022368008 container kill 579011dab85dd619f6222216aa93257972323e19131c63c3e2a3769885264cf3 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 05 06:18:50 compute-0 neutron-haproxy-ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923[209465]: [ALERT]    (209469) : Current worker (209471) exited with code 143 (Terminated)
Dec 05 06:18:50 compute-0 neutron-haproxy-ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923[209465]: [WARNING]  (209469) : All workers exited. Exiting... (0)
Dec 05 06:18:50 compute-0 systemd[1]: libpod-579011dab85dd619f6222216aa93257972323e19131c63c3e2a3769885264cf3.scope: Deactivated successfully.
Dec 05 06:18:50 compute-0 nova_compute[186329]: 2025-12-05 06:18:50.827 186333 DEBUG nova.compute.manager [req-7d1de61c-e6ae-441f-98d6-b1499c771338 req-7556cbe0-f3a7-4586-8774-eed816d85341 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8937fe36-e7a1-49d6-947a-6627f5bbfffb] Received event network-vif-unplugged-c5f3cbbb-3662-4096-8009-80a6ae4ad778 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:18:50 compute-0 nova_compute[186329]: 2025-12-05 06:18:50.828 186333 DEBUG oslo_concurrency.lockutils [req-7d1de61c-e6ae-441f-98d6-b1499c771338 req-7556cbe0-f3a7-4586-8774-eed816d85341 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "8937fe36-e7a1-49d6-947a-6627f5bbfffb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:18:50 compute-0 nova_compute[186329]: 2025-12-05 06:18:50.828 186333 DEBUG oslo_concurrency.lockutils [req-7d1de61c-e6ae-441f-98d6-b1499c771338 req-7556cbe0-f3a7-4586-8774-eed816d85341 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "8937fe36-e7a1-49d6-947a-6627f5bbfffb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:18:50 compute-0 nova_compute[186329]: 2025-12-05 06:18:50.828 186333 DEBUG oslo_concurrency.lockutils [req-7d1de61c-e6ae-441f-98d6-b1499c771338 req-7556cbe0-f3a7-4586-8774-eed816d85341 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "8937fe36-e7a1-49d6-947a-6627f5bbfffb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:18:50 compute-0 nova_compute[186329]: 2025-12-05 06:18:50.829 186333 DEBUG nova.compute.manager [req-7d1de61c-e6ae-441f-98d6-b1499c771338 req-7556cbe0-f3a7-4586-8774-eed816d85341 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8937fe36-e7a1-49d6-947a-6627f5bbfffb] No waiting events found dispatching network-vif-unplugged-c5f3cbbb-3662-4096-8009-80a6ae4ad778 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:18:50 compute-0 nova_compute[186329]: 2025-12-05 06:18:50.829 186333 DEBUG nova.compute.manager [req-7d1de61c-e6ae-441f-98d6-b1499c771338 req-7556cbe0-f3a7-4586-8774-eed816d85341 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8937fe36-e7a1-49d6-947a-6627f5bbfffb] Received event network-vif-unplugged-c5f3cbbb-3662-4096-8009-80a6ae4ad778 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 05 06:18:50 compute-0 podman[209848]: 2025-12-05 06:18:50.848070425 +0000 UTC m=+0.016986539 container died 579011dab85dd619f6222216aa93257972323e19131c63c3e2a3769885264cf3 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Dec 05 06:18:50 compute-0 systemd[1]: var-lib-containers-storage-overlay-6295eea2b9baea2078ea4e8cbceb523233f475adf88d43760b474a6a2b597cd8-merged.mount: Deactivated successfully.
Dec 05 06:18:50 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-579011dab85dd619f6222216aa93257972323e19131c63c3e2a3769885264cf3-userdata-shm.mount: Deactivated successfully.
Dec 05 06:18:50 compute-0 podman[209848]: 2025-12-05 06:18:50.868971712 +0000 UTC m=+0.037887817 container cleanup 579011dab85dd619f6222216aa93257972323e19131c63c3e2a3769885264cf3 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251202)
Dec 05 06:18:50 compute-0 systemd[1]: libpod-conmon-579011dab85dd619f6222216aa93257972323e19131c63c3e2a3769885264cf3.scope: Deactivated successfully.
Dec 05 06:18:50 compute-0 podman[209847]: 2025-12-05 06:18:50.876721604 +0000 UTC m=+0.044647577 container remove 579011dab85dd619f6222216aa93257972323e19131c63c3e2a3769885264cf3 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.build-date=20251202)
Dec 05 06:18:50 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:50.880 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[01075c82-8b28-4811-88c1-4d62a3693735]: (4, ("Fri Dec  5 06:18:50 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923 (579011dab85dd619f6222216aa93257972323e19131c63c3e2a3769885264cf3)\n579011dab85dd619f6222216aa93257972323e19131c63c3e2a3769885264cf3\nFri Dec  5 06:18:50 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923 (579011dab85dd619f6222216aa93257972323e19131c63c3e2a3769885264cf3)\n579011dab85dd619f6222216aa93257972323e19131c63c3e2a3769885264cf3\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:18:50 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:50.880 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[ba29e8ef-525b-4eec-9924-d2af7bd34b8d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:18:50 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:50.881 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/df5a1a6f-32f3-42ca-8c18-60ea4ce9d923.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/df5a1a6f-32f3-42ca-8c18-60ea4ce9d923.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:18:50 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:50.881 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[1ba60bc7-ae6b-4334-af0b-00a2b4e4cc78]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:18:50 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:50.882 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdf5a1a6f-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:18:50 compute-0 nova_compute[186329]: 2025-12-05 06:18:50.883 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:50 compute-0 kernel: tapdf5a1a6f-30: left promiscuous mode
Dec 05 06:18:50 compute-0 NetworkManager[55434]: <info>  [1764915530.8991] manager: (tapc5f3cbbb-36): new Tun device (/org/freedesktop/NetworkManager/Devices/43)
Dec 05 06:18:50 compute-0 nova_compute[186329]: 2025-12-05 06:18:50.899 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:50 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:50.902 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[9a73a45c-b0f8-4c9f-a7eb-225a31f947ef]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:18:50 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:50.912 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[6ac2835c-8557-47e2-9c02-0ec0066ba331]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:18:50 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:50.912 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[3a35bebd-be45-4e33-8a4f-304e33014d9c]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:18:50 compute-0 nova_compute[186329]: 2025-12-05 06:18:50.926 186333 INFO nova.virt.libvirt.driver [-] [instance: 8937fe36-e7a1-49d6-947a-6627f5bbfffb] Instance destroyed successfully.
Dec 05 06:18:50 compute-0 nova_compute[186329]: 2025-12-05 06:18:50.926 186333 DEBUG nova.objects.instance [None req-9c27ccbb-08a5-4cc9-9e39-0db7618cf4db 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lazy-loading 'resources' on Instance uuid 8937fe36-e7a1-49d6-947a-6627f5bbfffb obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:18:50 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:50.925 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[e5ea29cf-d73f-4873-a850-2423a1b2436a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 334537, 'reachable_time': 34985, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209885, 'error': None, 'target': 'ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:18:50 compute-0 systemd[1]: run-netns-ovnmeta\x2ddf5a1a6f\x2d32f3\x2d42ca\x2d8c18\x2d60ea4ce9d923.mount: Deactivated successfully.
Dec 05 06:18:50 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:50.929 104153 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 05 06:18:50 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:18:50.929 104153 DEBUG oslo.privsep.daemon [-] privsep: reply[d7cf8215-7fc0-40ad-8d76-7442e11e7d1b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:18:51 compute-0 nova_compute[186329]: 2025-12-05 06:18:51.015 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:51 compute-0 nova_compute[186329]: 2025-12-05 06:18:51.431 186333 DEBUG nova.virt.libvirt.vif [None req-9c27ccbb-08a5-4cc9-9e39-0db7618cf4db 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-12-05T06:17:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1095450282',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1095450282',id=8,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T06:17:28Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='72c4ee5cc96a42b99210abaf8ae6fcc3',ramdisk_id='',reservation_id='r-0dzsfd99',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',clean_attempts='1',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1144698866',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1144698866-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T06:18:27Z,user_data=None,user_id='6e966152abb6429a8d2dc82faf5464b5',uuid=8937fe36-e7a1-49d6-947a-6627f5bbfffb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c5f3cbbb-3662-4096-8009-80a6ae4ad778", "address": "fa:16:3e:23:3d:38", "network": {"id": "df5a1a6f-32f3-42ca-8c18-60ea4ce9d923", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1816665749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36f184a410c54894823168ed0f00b1ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5f3cbbb-36", "ovs_interfaceid": "c5f3cbbb-3662-4096-8009-80a6ae4ad778", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 05 06:18:51 compute-0 nova_compute[186329]: 2025-12-05 06:18:51.431 186333 DEBUG nova.network.os_vif_util [None req-9c27ccbb-08a5-4cc9-9e39-0db7618cf4db 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Converting VIF {"id": "c5f3cbbb-3662-4096-8009-80a6ae4ad778", "address": "fa:16:3e:23:3d:38", "network": {"id": "df5a1a6f-32f3-42ca-8c18-60ea4ce9d923", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1816665749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36f184a410c54894823168ed0f00b1ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5f3cbbb-36", "ovs_interfaceid": "c5f3cbbb-3662-4096-8009-80a6ae4ad778", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:18:51 compute-0 nova_compute[186329]: 2025-12-05 06:18:51.432 186333 DEBUG nova.network.os_vif_util [None req-9c27ccbb-08a5-4cc9-9e39-0db7618cf4db 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:23:3d:38,bridge_name='br-int',has_traffic_filtering=True,id=c5f3cbbb-3662-4096-8009-80a6ae4ad778,network=Network(df5a1a6f-32f3-42ca-8c18-60ea4ce9d923),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5f3cbbb-36') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:18:51 compute-0 nova_compute[186329]: 2025-12-05 06:18:51.432 186333 DEBUG os_vif [None req-9c27ccbb-08a5-4cc9-9e39-0db7618cf4db 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:23:3d:38,bridge_name='br-int',has_traffic_filtering=True,id=c5f3cbbb-3662-4096-8009-80a6ae4ad778,network=Network(df5a1a6f-32f3-42ca-8c18-60ea4ce9d923),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5f3cbbb-36') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 05 06:18:51 compute-0 nova_compute[186329]: 2025-12-05 06:18:51.433 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:51 compute-0 nova_compute[186329]: 2025-12-05 06:18:51.434 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc5f3cbbb-36, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:18:51 compute-0 nova_compute[186329]: 2025-12-05 06:18:51.435 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:51 compute-0 nova_compute[186329]: 2025-12-05 06:18:51.437 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:18:51 compute-0 nova_compute[186329]: 2025-12-05 06:18:51.438 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:51 compute-0 nova_compute[186329]: 2025-12-05 06:18:51.438 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=003504fb-80c0-4474-a9fa-bbce4a024f1b) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:18:51 compute-0 nova_compute[186329]: 2025-12-05 06:18:51.438 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:51 compute-0 nova_compute[186329]: 2025-12-05 06:18:51.439 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:51 compute-0 nova_compute[186329]: 2025-12-05 06:18:51.441 186333 INFO os_vif [None req-9c27ccbb-08a5-4cc9-9e39-0db7618cf4db 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:23:3d:38,bridge_name='br-int',has_traffic_filtering=True,id=c5f3cbbb-3662-4096-8009-80a6ae4ad778,network=Network(df5a1a6f-32f3-42ca-8c18-60ea4ce9d923),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5f3cbbb-36')
Dec 05 06:18:51 compute-0 nova_compute[186329]: 2025-12-05 06:18:51.441 186333 INFO nova.virt.libvirt.driver [None req-9c27ccbb-08a5-4cc9-9e39-0db7618cf4db 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 8937fe36-e7a1-49d6-947a-6627f5bbfffb] Deleting instance files /var/lib/nova/instances/8937fe36-e7a1-49d6-947a-6627f5bbfffb_del
Dec 05 06:18:51 compute-0 nova_compute[186329]: 2025-12-05 06:18:51.441 186333 INFO nova.virt.libvirt.driver [None req-9c27ccbb-08a5-4cc9-9e39-0db7618cf4db 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 8937fe36-e7a1-49d6-947a-6627f5bbfffb] Deletion of /var/lib/nova/instances/8937fe36-e7a1-49d6-947a-6627f5bbfffb_del complete
Dec 05 06:18:51 compute-0 nova_compute[186329]: 2025-12-05 06:18:51.566 186333 INFO nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance 83dcc9e9-9fcf-4e34-8b76-598c38ed9bc0 has allocations against this compute host but is not found in the database.
Dec 05 06:18:51 compute-0 nova_compute[186329]: 2025-12-05 06:18:51.567 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance 8937fe36-e7a1-49d6-947a-6627f5bbfffb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 05 06:18:51 compute-0 nova_compute[186329]: 2025-12-05 06:18:51.567 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 05 06:18:51 compute-0 nova_compute[186329]: 2025-12-05 06:18:51.567 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 06:18:49 up 56 min,  0 user,  load average: 0.19, 0.23, 0.32\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_72c4ee5cc96a42b99210abaf8ae6fcc3': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 05 06:18:51 compute-0 nova_compute[186329]: 2025-12-05 06:18:51.606 186333 DEBUG nova.compute.provider_tree [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:18:51 compute-0 nova_compute[186329]: 2025-12-05 06:18:51.949 186333 INFO nova.compute.manager [None req-9c27ccbb-08a5-4cc9-9e39-0db7618cf4db 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 8937fe36-e7a1-49d6-947a-6627f5bbfffb] Took 1.26 seconds to destroy the instance on the hypervisor.
Dec 05 06:18:51 compute-0 nova_compute[186329]: 2025-12-05 06:18:51.950 186333 DEBUG oslo.service.backend._eventlet.loopingcall [None req-9c27ccbb-08a5-4cc9-9e39-0db7618cf4db 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 05 06:18:51 compute-0 nova_compute[186329]: 2025-12-05 06:18:51.950 186333 DEBUG nova.compute.manager [-] [instance: 8937fe36-e7a1-49d6-947a-6627f5bbfffb] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 05 06:18:51 compute-0 nova_compute[186329]: 2025-12-05 06:18:51.950 186333 DEBUG nova.network.neutron [-] [instance: 8937fe36-e7a1-49d6-947a-6627f5bbfffb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 05 06:18:51 compute-0 nova_compute[186329]: 2025-12-05 06:18:51.950 186333 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:18:52 compute-0 nova_compute[186329]: 2025-12-05 06:18:52.110 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:18:52 compute-0 nova_compute[186329]: 2025-12-05 06:18:52.489 186333 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:18:52 compute-0 nova_compute[186329]: 2025-12-05 06:18:52.616 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 05 06:18:52 compute-0 nova_compute[186329]: 2025-12-05 06:18:52.616 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.086s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:18:52 compute-0 nova_compute[186329]: 2025-12-05 06:18:52.736 186333 DEBUG nova.compute.manager [req-9d1d9731-04da-4d4e-89a5-80604546b1be req-2b43688f-5c26-4efa-987e-da61a54ac1b6 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8937fe36-e7a1-49d6-947a-6627f5bbfffb] Received event network-vif-deleted-c5f3cbbb-3662-4096-8009-80a6ae4ad778 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:18:52 compute-0 nova_compute[186329]: 2025-12-05 06:18:52.736 186333 INFO nova.compute.manager [req-9d1d9731-04da-4d4e-89a5-80604546b1be req-2b43688f-5c26-4efa-987e-da61a54ac1b6 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8937fe36-e7a1-49d6-947a-6627f5bbfffb] Neutron deleted interface c5f3cbbb-3662-4096-8009-80a6ae4ad778; detaching it from the instance and deleting it from the info cache
Dec 05 06:18:52 compute-0 nova_compute[186329]: 2025-12-05 06:18:52.736 186333 DEBUG nova.network.neutron [req-9d1d9731-04da-4d4e-89a5-80604546b1be req-2b43688f-5c26-4efa-987e-da61a54ac1b6 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8937fe36-e7a1-49d6-947a-6627f5bbfffb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:18:52 compute-0 nova_compute[186329]: 2025-12-05 06:18:52.871 186333 DEBUG nova.compute.manager [req-53d192c0-f761-42ec-91df-f78f3291a819 req-cfba7fda-e154-4d0c-9ffe-4ace0c36d7c5 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8937fe36-e7a1-49d6-947a-6627f5bbfffb] Received event network-vif-unplugged-c5f3cbbb-3662-4096-8009-80a6ae4ad778 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:18:52 compute-0 nova_compute[186329]: 2025-12-05 06:18:52.871 186333 DEBUG oslo_concurrency.lockutils [req-53d192c0-f761-42ec-91df-f78f3291a819 req-cfba7fda-e154-4d0c-9ffe-4ace0c36d7c5 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "8937fe36-e7a1-49d6-947a-6627f5bbfffb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:18:52 compute-0 nova_compute[186329]: 2025-12-05 06:18:52.871 186333 DEBUG oslo_concurrency.lockutils [req-53d192c0-f761-42ec-91df-f78f3291a819 req-cfba7fda-e154-4d0c-9ffe-4ace0c36d7c5 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "8937fe36-e7a1-49d6-947a-6627f5bbfffb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:18:52 compute-0 nova_compute[186329]: 2025-12-05 06:18:52.871 186333 DEBUG oslo_concurrency.lockutils [req-53d192c0-f761-42ec-91df-f78f3291a819 req-cfba7fda-e154-4d0c-9ffe-4ace0c36d7c5 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "8937fe36-e7a1-49d6-947a-6627f5bbfffb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:18:52 compute-0 nova_compute[186329]: 2025-12-05 06:18:52.871 186333 DEBUG nova.compute.manager [req-53d192c0-f761-42ec-91df-f78f3291a819 req-cfba7fda-e154-4d0c-9ffe-4ace0c36d7c5 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8937fe36-e7a1-49d6-947a-6627f5bbfffb] No waiting events found dispatching network-vif-unplugged-c5f3cbbb-3662-4096-8009-80a6ae4ad778 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:18:52 compute-0 nova_compute[186329]: 2025-12-05 06:18:52.871 186333 DEBUG nova.compute.manager [req-53d192c0-f761-42ec-91df-f78f3291a819 req-cfba7fda-e154-4d0c-9ffe-4ace0c36d7c5 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8937fe36-e7a1-49d6-947a-6627f5bbfffb] Received event network-vif-unplugged-c5f3cbbb-3662-4096-8009-80a6ae4ad778 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 05 06:18:53 compute-0 nova_compute[186329]: 2025-12-05 06:18:53.198 186333 DEBUG nova.network.neutron [-] [instance: 8937fe36-e7a1-49d6-947a-6627f5bbfffb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:18:53 compute-0 nova_compute[186329]: 2025-12-05 06:18:53.240 186333 DEBUG nova.compute.manager [req-9d1d9731-04da-4d4e-89a5-80604546b1be req-2b43688f-5c26-4efa-987e-da61a54ac1b6 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8937fe36-e7a1-49d6-947a-6627f5bbfffb] Detach interface failed, port_id=c5f3cbbb-3662-4096-8009-80a6ae4ad778, reason: Instance 8937fe36-e7a1-49d6-947a-6627f5bbfffb could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Dec 05 06:18:53 compute-0 nova_compute[186329]: 2025-12-05 06:18:53.617 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:18:53 compute-0 nova_compute[186329]: 2025-12-05 06:18:53.617 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:18:53 compute-0 nova_compute[186329]: 2025-12-05 06:18:53.617 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:18:53 compute-0 nova_compute[186329]: 2025-12-05 06:18:53.617 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:18:53 compute-0 nova_compute[186329]: 2025-12-05 06:18:53.618 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 05 06:18:53 compute-0 nova_compute[186329]: 2025-12-05 06:18:53.694 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:53 compute-0 nova_compute[186329]: 2025-12-05 06:18:53.704 186333 INFO nova.compute.manager [-] [instance: 8937fe36-e7a1-49d6-947a-6627f5bbfffb] Took 1.75 seconds to deallocate network for instance.
Dec 05 06:18:54 compute-0 nova_compute[186329]: 2025-12-05 06:18:54.215 186333 DEBUG oslo_concurrency.lockutils [None req-9c27ccbb-08a5-4cc9-9e39-0db7618cf4db 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:18:54 compute-0 nova_compute[186329]: 2025-12-05 06:18:54.216 186333 DEBUG oslo_concurrency.lockutils [None req-9c27ccbb-08a5-4cc9-9e39-0db7618cf4db 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:18:54 compute-0 nova_compute[186329]: 2025-12-05 06:18:54.256 186333 DEBUG nova.compute.provider_tree [None req-9c27ccbb-08a5-4cc9-9e39-0db7618cf4db 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:18:54 compute-0 nova_compute[186329]: 2025-12-05 06:18:54.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:18:54 compute-0 nova_compute[186329]: 2025-12-05 06:18:54.760 186333 DEBUG nova.scheduler.client.report [None req-9c27ccbb-08a5-4cc9-9e39-0db7618cf4db 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:18:55 compute-0 nova_compute[186329]: 2025-12-05 06:18:55.266 186333 DEBUG oslo_concurrency.lockutils [None req-9c27ccbb-08a5-4cc9-9e39-0db7618cf4db 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.050s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:18:55 compute-0 nova_compute[186329]: 2025-12-05 06:18:55.283 186333 INFO nova.scheduler.client.report [None req-9c27ccbb-08a5-4cc9-9e39-0db7618cf4db 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Deleted allocations for instance 8937fe36-e7a1-49d6-947a-6627f5bbfffb
Dec 05 06:18:56 compute-0 nova_compute[186329]: 2025-12-05 06:18:56.300 186333 DEBUG oslo_concurrency.lockutils [None req-9c27ccbb-08a5-4cc9-9e39-0db7618cf4db 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "8937fe36-e7a1-49d6-947a-6627f5bbfffb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.132s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:18:56 compute-0 nova_compute[186329]: 2025-12-05 06:18:56.439 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:56 compute-0 podman[209890]: 2025-12-05 06:18:56.458521956 +0000 UTC m=+0.043000248 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec)
Dec 05 06:18:56 compute-0 podman[209891]: 2025-12-05 06:18:56.467386113 +0000 UTC m=+0.049359305 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, name=ubi9-minimal, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, config_id=edpm, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, version=9.6, io.openshift.expose-services=)
Dec 05 06:18:56 compute-0 podman[209892]: 2025-12-05 06:18:56.469209491 +0000 UTC m=+0.050891987 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Dec 05 06:18:58 compute-0 nova_compute[186329]: 2025-12-05 06:18:58.695 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:18:59 compute-0 podman[196599]: time="2025-12-05T06:18:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:18:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:18:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:18:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:18:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2583 "" "Go-http-client/1.1"
Dec 05 06:19:01 compute-0 openstack_network_exporter[198686]: ERROR   06:19:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:19:01 compute-0 openstack_network_exporter[198686]: ERROR   06:19:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:19:01 compute-0 openstack_network_exporter[198686]: ERROR   06:19:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:19:01 compute-0 openstack_network_exporter[198686]: ERROR   06:19:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:19:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:19:01 compute-0 openstack_network_exporter[198686]: ERROR   06:19:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:19:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:19:01 compute-0 nova_compute[186329]: 2025-12-05 06:19:01.439 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:19:03 compute-0 nova_compute[186329]: 2025-12-05 06:19:03.696 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:19:06 compute-0 nova_compute[186329]: 2025-12-05 06:19:06.441 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:19:08 compute-0 nova_compute[186329]: 2025-12-05 06:19:08.697 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:19:11 compute-0 nova_compute[186329]: 2025-12-05 06:19:11.442 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:19:13 compute-0 nova_compute[186329]: 2025-12-05 06:19:13.700 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:19:16 compute-0 nova_compute[186329]: 2025-12-05 06:19:16.444 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:19:18 compute-0 podman[209944]: 2025-12-05 06:19:18.452451951 +0000 UTC m=+0.036366056 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 06:19:18 compute-0 podman[209943]: 2025-12-05 06:19:18.47347798 +0000 UTC m=+0.058490209 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec 05 06:19:18 compute-0 nova_compute[186329]: 2025-12-05 06:19:18.701 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:19:21 compute-0 nova_compute[186329]: 2025-12-05 06:19:21.445 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:19:23 compute-0 nova_compute[186329]: 2025-12-05 06:19:23.703 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:19:26 compute-0 nova_compute[186329]: 2025-12-05 06:19:26.375 186333 DEBUG oslo_concurrency.lockutils [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Acquiring lock "2a160ef8-a2f1-4959-a154-f46101bf8277" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:19:26 compute-0 nova_compute[186329]: 2025-12-05 06:19:26.376 186333 DEBUG oslo_concurrency.lockutils [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "2a160ef8-a2f1-4959-a154-f46101bf8277" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:19:26 compute-0 nova_compute[186329]: 2025-12-05 06:19:26.447 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:19:26 compute-0 nova_compute[186329]: 2025-12-05 06:19:26.878 186333 DEBUG nova.compute.manager [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Dec 05 06:19:27 compute-0 nova_compute[186329]: 2025-12-05 06:19:27.418 186333 DEBUG oslo_concurrency.lockutils [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:19:27 compute-0 nova_compute[186329]: 2025-12-05 06:19:27.418 186333 DEBUG oslo_concurrency.lockutils [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:19:27 compute-0 nova_compute[186329]: 2025-12-05 06:19:27.425 186333 DEBUG nova.virt.hardware [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 05 06:19:27 compute-0 nova_compute[186329]: 2025-12-05 06:19:27.425 186333 INFO nova.compute.claims [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Claim successful on node compute-0.ctlplane.example.com
Dec 05 06:19:27 compute-0 podman[209989]: 2025-12-05 06:19:27.473440109 +0000 UTC m=+0.048525545 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 05 06:19:27 compute-0 podman[209987]: 2025-12-05 06:19:27.493442474 +0000 UTC m=+0.071551100 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Dec 05 06:19:27 compute-0 podman[209988]: 2025-12-05 06:19:27.494795157 +0000 UTC m=+0.072878746 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, release=1755695350, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.openshift.tags=minimal rhel9)
Dec 05 06:19:28 compute-0 nova_compute[186329]: 2025-12-05 06:19:28.479 186333 DEBUG nova.compute.provider_tree [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:19:28 compute-0 nova_compute[186329]: 2025-12-05 06:19:28.705 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:19:28 compute-0 nova_compute[186329]: 2025-12-05 06:19:28.983 186333 DEBUG nova.scheduler.client.report [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:19:29 compute-0 nova_compute[186329]: 2025-12-05 06:19:29.490 186333 DEBUG oslo_concurrency.lockutils [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.071s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:19:29 compute-0 nova_compute[186329]: 2025-12-05 06:19:29.490 186333 DEBUG nova.compute.manager [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Dec 05 06:19:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:19:29.502 104041 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:19:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:19:29.502 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:19:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:19:29.502 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:19:29 compute-0 podman[196599]: time="2025-12-05T06:19:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:19:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:19:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:19:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:19:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2588 "" "Go-http-client/1.1"
Dec 05 06:19:29 compute-0 nova_compute[186329]: 2025-12-05 06:19:29.997 186333 DEBUG nova.compute.manager [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Dec 05 06:19:29 compute-0 nova_compute[186329]: 2025-12-05 06:19:29.998 186333 DEBUG nova.network.neutron [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Dec 05 06:19:29 compute-0 nova_compute[186329]: 2025-12-05 06:19:29.998 186333 WARNING neutronclient.v2_0.client [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:19:29 compute-0 nova_compute[186329]: 2025-12-05 06:19:29.998 186333 WARNING neutronclient.v2_0.client [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:19:30 compute-0 nova_compute[186329]: 2025-12-05 06:19:30.503 186333 INFO nova.virt.libvirt.driver [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 06:19:30 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:19:30.725 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '46:84:d1', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'a6:99:88:db:d9:a2'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:19:30 compute-0 nova_compute[186329]: 2025-12-05 06:19:30.726 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:19:30 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:19:30.726 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 05 06:19:30 compute-0 nova_compute[186329]: 2025-12-05 06:19:30.809 186333 DEBUG nova.network.neutron [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Successfully created port: de6bc18e-8fd5-4a3e-ac7d-36278d4793df _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Dec 05 06:19:31 compute-0 nova_compute[186329]: 2025-12-05 06:19:31.007 186333 DEBUG nova.compute.manager [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Dec 05 06:19:31 compute-0 openstack_network_exporter[198686]: ERROR   06:19:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:19:31 compute-0 openstack_network_exporter[198686]: ERROR   06:19:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:19:31 compute-0 openstack_network_exporter[198686]: ERROR   06:19:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:19:31 compute-0 openstack_network_exporter[198686]: ERROR   06:19:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:19:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:19:31 compute-0 openstack_network_exporter[198686]: ERROR   06:19:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:19:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:19:31 compute-0 nova_compute[186329]: 2025-12-05 06:19:31.449 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:19:31 compute-0 nova_compute[186329]: 2025-12-05 06:19:31.623 186333 DEBUG nova.network.neutron [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Successfully updated port: de6bc18e-8fd5-4a3e-ac7d-36278d4793df _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Dec 05 06:19:31 compute-0 nova_compute[186329]: 2025-12-05 06:19:31.665 186333 DEBUG nova.compute.manager [req-44e1e9ec-d76f-4178-8cde-33dc7fea68ac req-f705a35d-0544-484f-9b73-4b682a42b141 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Received event network-changed-de6bc18e-8fd5-4a3e-ac7d-36278d4793df external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:19:31 compute-0 nova_compute[186329]: 2025-12-05 06:19:31.665 186333 DEBUG nova.compute.manager [req-44e1e9ec-d76f-4178-8cde-33dc7fea68ac req-f705a35d-0544-484f-9b73-4b682a42b141 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Refreshing instance network info cache due to event network-changed-de6bc18e-8fd5-4a3e-ac7d-36278d4793df. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 05 06:19:31 compute-0 nova_compute[186329]: 2025-12-05 06:19:31.665 186333 DEBUG oslo_concurrency.lockutils [req-44e1e9ec-d76f-4178-8cde-33dc7fea68ac req-f705a35d-0544-484f-9b73-4b682a42b141 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "refresh_cache-2a160ef8-a2f1-4959-a154-f46101bf8277" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:19:31 compute-0 nova_compute[186329]: 2025-12-05 06:19:31.666 186333 DEBUG oslo_concurrency.lockutils [req-44e1e9ec-d76f-4178-8cde-33dc7fea68ac req-f705a35d-0544-484f-9b73-4b682a42b141 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquired lock "refresh_cache-2a160ef8-a2f1-4959-a154-f46101bf8277" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:19:31 compute-0 nova_compute[186329]: 2025-12-05 06:19:31.666 186333 DEBUG nova.network.neutron [req-44e1e9ec-d76f-4178-8cde-33dc7fea68ac req-f705a35d-0544-484f-9b73-4b682a42b141 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Refreshing network info cache for port de6bc18e-8fd5-4a3e-ac7d-36278d4793df _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 05 06:19:32 compute-0 nova_compute[186329]: 2025-12-05 06:19:32.020 186333 DEBUG nova.compute.manager [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Dec 05 06:19:32 compute-0 nova_compute[186329]: 2025-12-05 06:19:32.022 186333 DEBUG nova.virt.libvirt.driver [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Dec 05 06:19:32 compute-0 nova_compute[186329]: 2025-12-05 06:19:32.022 186333 INFO nova.virt.libvirt.driver [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Creating image(s)
Dec 05 06:19:32 compute-0 nova_compute[186329]: 2025-12-05 06:19:32.023 186333 DEBUG oslo_concurrency.lockutils [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Acquiring lock "/var/lib/nova/instances/2a160ef8-a2f1-4959-a154-f46101bf8277/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:19:32 compute-0 nova_compute[186329]: 2025-12-05 06:19:32.023 186333 DEBUG oslo_concurrency.lockutils [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "/var/lib/nova/instances/2a160ef8-a2f1-4959-a154-f46101bf8277/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:19:32 compute-0 nova_compute[186329]: 2025-12-05 06:19:32.024 186333 DEBUG oslo_concurrency.lockutils [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "/var/lib/nova/instances/2a160ef8-a2f1-4959-a154-f46101bf8277/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:19:32 compute-0 nova_compute[186329]: 2025-12-05 06:19:32.024 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:19:32 compute-0 nova_compute[186329]: 2025-12-05 06:19:32.027 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:19:32 compute-0 nova_compute[186329]: 2025-12-05 06:19:32.033 186333 DEBUG oslo_concurrency.processutils [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:19:32 compute-0 nova_compute[186329]: 2025-12-05 06:19:32.076 186333 DEBUG oslo_concurrency.processutils [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:19:32 compute-0 nova_compute[186329]: 2025-12-05 06:19:32.077 186333 DEBUG oslo_concurrency.lockutils [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Acquiring lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:19:32 compute-0 nova_compute[186329]: 2025-12-05 06:19:32.077 186333 DEBUG oslo_concurrency.lockutils [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:19:32 compute-0 nova_compute[186329]: 2025-12-05 06:19:32.078 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:19:32 compute-0 nova_compute[186329]: 2025-12-05 06:19:32.080 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:19:32 compute-0 nova_compute[186329]: 2025-12-05 06:19:32.081 186333 DEBUG oslo_concurrency.processutils [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:19:32 compute-0 nova_compute[186329]: 2025-12-05 06:19:32.121 186333 DEBUG oslo_concurrency.processutils [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:19:32 compute-0 nova_compute[186329]: 2025-12-05 06:19:32.122 186333 DEBUG oslo_concurrency.processutils [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b,backing_fmt=raw /var/lib/nova/instances/2a160ef8-a2f1-4959-a154-f46101bf8277/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:19:32 compute-0 nova_compute[186329]: 2025-12-05 06:19:32.127 186333 DEBUG oslo_concurrency.lockutils [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Acquiring lock "refresh_cache-2a160ef8-a2f1-4959-a154-f46101bf8277" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:19:32 compute-0 nova_compute[186329]: 2025-12-05 06:19:32.142 186333 DEBUG oslo_concurrency.processutils [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b,backing_fmt=raw /var/lib/nova/instances/2a160ef8-a2f1-4959-a154-f46101bf8277/disk 1073741824" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:19:32 compute-0 nova_compute[186329]: 2025-12-05 06:19:32.143 186333 DEBUG oslo_concurrency.lockutils [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.065s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:19:32 compute-0 nova_compute[186329]: 2025-12-05 06:19:32.143 186333 DEBUG oslo_concurrency.processutils [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:19:32 compute-0 nova_compute[186329]: 2025-12-05 06:19:32.169 186333 WARNING neutronclient.v2_0.client [req-44e1e9ec-d76f-4178-8cde-33dc7fea68ac req-f705a35d-0544-484f-9b73-4b682a42b141 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:19:32 compute-0 nova_compute[186329]: 2025-12-05 06:19:32.185 186333 DEBUG oslo_concurrency.processutils [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:19:32 compute-0 nova_compute[186329]: 2025-12-05 06:19:32.186 186333 DEBUG nova.virt.disk.api [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Checking if we can resize image /var/lib/nova/instances/2a160ef8-a2f1-4959-a154-f46101bf8277/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 05 06:19:32 compute-0 nova_compute[186329]: 2025-12-05 06:19:32.186 186333 DEBUG oslo_concurrency.processutils [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2a160ef8-a2f1-4959-a154-f46101bf8277/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:19:32 compute-0 nova_compute[186329]: 2025-12-05 06:19:32.231 186333 DEBUG oslo_concurrency.processutils [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2a160ef8-a2f1-4959-a154-f46101bf8277/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:19:32 compute-0 nova_compute[186329]: 2025-12-05 06:19:32.232 186333 DEBUG nova.virt.disk.api [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Cannot resize image /var/lib/nova/instances/2a160ef8-a2f1-4959-a154-f46101bf8277/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 05 06:19:32 compute-0 nova_compute[186329]: 2025-12-05 06:19:32.233 186333 DEBUG nova.virt.libvirt.driver [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Dec 05 06:19:32 compute-0 nova_compute[186329]: 2025-12-05 06:19:32.233 186333 DEBUG nova.virt.libvirt.driver [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Ensure instance console log exists: /var/lib/nova/instances/2a160ef8-a2f1-4959-a154-f46101bf8277/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 05 06:19:32 compute-0 nova_compute[186329]: 2025-12-05 06:19:32.233 186333 DEBUG oslo_concurrency.lockutils [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:19:32 compute-0 nova_compute[186329]: 2025-12-05 06:19:32.234 186333 DEBUG oslo_concurrency.lockutils [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:19:32 compute-0 nova_compute[186329]: 2025-12-05 06:19:32.234 186333 DEBUG oslo_concurrency.lockutils [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:19:32 compute-0 nova_compute[186329]: 2025-12-05 06:19:32.495 186333 DEBUG nova.network.neutron [req-44e1e9ec-d76f-4178-8cde-33dc7fea68ac req-f705a35d-0544-484f-9b73-4b682a42b141 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 05 06:19:32 compute-0 nova_compute[186329]: 2025-12-05 06:19:32.626 186333 DEBUG nova.network.neutron [req-44e1e9ec-d76f-4178-8cde-33dc7fea68ac req-f705a35d-0544-484f-9b73-4b682a42b141 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:19:33 compute-0 nova_compute[186329]: 2025-12-05 06:19:33.132 186333 DEBUG oslo_concurrency.lockutils [req-44e1e9ec-d76f-4178-8cde-33dc7fea68ac req-f705a35d-0544-484f-9b73-4b682a42b141 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Releasing lock "refresh_cache-2a160ef8-a2f1-4959-a154-f46101bf8277" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:19:33 compute-0 nova_compute[186329]: 2025-12-05 06:19:33.133 186333 DEBUG oslo_concurrency.lockutils [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Acquired lock "refresh_cache-2a160ef8-a2f1-4959-a154-f46101bf8277" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:19:33 compute-0 nova_compute[186329]: 2025-12-05 06:19:33.133 186333 DEBUG nova.network.neutron [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 05 06:19:33 compute-0 nova_compute[186329]: 2025-12-05 06:19:33.708 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:19:34 compute-0 nova_compute[186329]: 2025-12-05 06:19:34.503 186333 DEBUG nova.network.neutron [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 05 06:19:34 compute-0 nova_compute[186329]: 2025-12-05 06:19:34.685 186333 WARNING neutronclient.v2_0.client [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:19:34 compute-0 nova_compute[186329]: 2025-12-05 06:19:34.812 186333 DEBUG nova.network.neutron [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Updating instance_info_cache with network_info: [{"id": "de6bc18e-8fd5-4a3e-ac7d-36278d4793df", "address": "fa:16:3e:86:0b:00", "network": {"id": "df5a1a6f-32f3-42ca-8c18-60ea4ce9d923", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1816665749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36f184a410c54894823168ed0f00b1ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde6bc18e-8f", "ovs_interfaceid": "de6bc18e-8fd5-4a3e-ac7d-36278d4793df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.317 186333 DEBUG oslo_concurrency.lockutils [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Releasing lock "refresh_cache-2a160ef8-a2f1-4959-a154-f46101bf8277" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.317 186333 DEBUG nova.compute.manager [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Instance network_info: |[{"id": "de6bc18e-8fd5-4a3e-ac7d-36278d4793df", "address": "fa:16:3e:86:0b:00", "network": {"id": "df5a1a6f-32f3-42ca-8c18-60ea4ce9d923", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1816665749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36f184a410c54894823168ed0f00b1ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde6bc18e-8f", "ovs_interfaceid": "de6bc18e-8fd5-4a3e-ac7d-36278d4793df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.319 186333 DEBUG nova.virt.libvirt.driver [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Start _get_guest_xml network_info=[{"id": "de6bc18e-8fd5-4a3e-ac7d-36278d4793df", "address": "fa:16:3e:86:0b:00", "network": {"id": "df5a1a6f-32f3-42ca-8c18-60ea4ce9d923", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1816665749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36f184a410c54894823168ed0f00b1ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde6bc18e-8f", "ovs_interfaceid": "de6bc18e-8fd5-4a3e-ac7d-36278d4793df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T06:07:47Z,direct_url=<?>,disk_format='qcow2',id=6903ca06-7f44-4ad2-ab8b-0d16feef7d51,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fcef582be2274b9ba43451b49b4066ec',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T06:07:49Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'device_name': '/dev/vda', 'guest_format': None, 'image_id': '6903ca06-7f44-4ad2-ab8b-0d16feef7d51'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.323 186333 WARNING nova.virt.libvirt.driver [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.324 186333 DEBUG nova.virt.driver [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='6903ca06-7f44-4ad2-ab8b-0d16feef7d51', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteHostMaintenanceStrategy-server-308361040', uuid='2a160ef8-a2f1-4959-a154-f46101bf8277'), owner=OwnerMeta(userid='6e966152abb6429a8d2dc82faf5464b5', username='tempest-TestExecuteHostMaintenanceStrategy-1144698866-project-admin', projectid='72c4ee5cc96a42b99210abaf8ae6fcc3', projectname='tempest-TestExecuteHostMaintenanceStrategy-1144698866'), image=ImageMeta(id='6903ca06-7f44-4ad2-ab8b-0d16feef7d51', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='cb13e320-971c-46c2-a935-d695f3631bf8', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "de6bc18e-8fd5-4a3e-ac7d-36278d4793df", "address": "fa:16:3e:86:0b:00", "network": {"id": "df5a1a6f-32f3-42ca-8c18-60ea4ce9d923", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1816665749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36f184a410c54894823168ed0f00b1ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde6bc18e-8f", "ovs_interfaceid": 
"de6bc18e-8fd5-4a3e-ac7d-36278d4793df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764915575.3247793) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.330 186333 DEBUG nova.virt.libvirt.host [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.330 186333 DEBUG nova.virt.libvirt.host [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.333 186333 DEBUG nova.virt.libvirt.host [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.333 186333 DEBUG nova.virt.libvirt.host [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.335 186333 DEBUG nova.virt.libvirt.driver [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.335 186333 DEBUG nova.virt.hardware [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T06:07:46Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cb13e320-971c-46c2-a935-d695f3631bf8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T06:07:47Z,direct_url=<?>,disk_format='qcow2',id=6903ca06-7f44-4ad2-ab8b-0d16feef7d51,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fcef582be2274b9ba43451b49b4066ec',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T06:07:49Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.335 186333 DEBUG nova.virt.hardware [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.335 186333 DEBUG nova.virt.hardware [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.336 186333 DEBUG nova.virt.hardware [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.336 186333 DEBUG nova.virt.hardware [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.336 186333 DEBUG nova.virt.hardware [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.336 186333 DEBUG nova.virt.hardware [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.337 186333 DEBUG nova.virt.hardware [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.337 186333 DEBUG nova.virt.hardware [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.337 186333 DEBUG nova.virt.hardware [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.337 186333 DEBUG nova.virt.hardware [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.341 186333 DEBUG nova.virt.libvirt.vif [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-05T06:19:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-308361040',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-308361040',id=11,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='72c4ee5cc96a42b99210abaf8ae6fcc3',ramdisk_id='',reservation_id='r-cl0afaj0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1144698866',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1144698866-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T06:19:31Z,user_data=None,user_id='6e966152abb6429a8d2dc82faf5464b5',uuid=2a160ef8-a2f1-4959-a154-f46101bf8277,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "de6bc18e-8fd5-4a3e-ac7d-36278d4793df", "address": "fa:16:3e:86:0b:00", "network": {"id": "df5a1a6f-32f3-42ca-8c18-60ea4ce9d923", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1816665749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36f184a410c54894823168ed0f00b1ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde6bc18e-8f", "ovs_interfaceid": "de6bc18e-8fd5-4a3e-ac7d-36278d4793df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.341 186333 DEBUG nova.network.os_vif_util [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Converting VIF {"id": "de6bc18e-8fd5-4a3e-ac7d-36278d4793df", "address": "fa:16:3e:86:0b:00", "network": {"id": "df5a1a6f-32f3-42ca-8c18-60ea4ce9d923", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1816665749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36f184a410c54894823168ed0f00b1ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde6bc18e-8f", "ovs_interfaceid": "de6bc18e-8fd5-4a3e-ac7d-36278d4793df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.342 186333 DEBUG nova.network.os_vif_util [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:86:0b:00,bridge_name='br-int',has_traffic_filtering=True,id=de6bc18e-8fd5-4a3e-ac7d-36278d4793df,network=Network(df5a1a6f-32f3-42ca-8c18-60ea4ce9d923),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde6bc18e-8f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.343 186333 DEBUG nova.objects.instance [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2a160ef8-a2f1-4959-a154-f46101bf8277 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.848 186333 DEBUG nova.virt.libvirt.driver [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] End _get_guest_xml xml=<domain type="kvm">
Dec 05 06:19:35 compute-0 nova_compute[186329]:   <uuid>2a160ef8-a2f1-4959-a154-f46101bf8277</uuid>
Dec 05 06:19:35 compute-0 nova_compute[186329]:   <name>instance-0000000b</name>
Dec 05 06:19:35 compute-0 nova_compute[186329]:   <memory>131072</memory>
Dec 05 06:19:35 compute-0 nova_compute[186329]:   <vcpu>1</vcpu>
Dec 05 06:19:35 compute-0 nova_compute[186329]:   <metadata>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 06:19:35 compute-0 nova_compute[186329]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-308361040</nova:name>
Dec 05 06:19:35 compute-0 nova_compute[186329]:       <nova:creationTime>2025-12-05 06:19:35</nova:creationTime>
Dec 05 06:19:35 compute-0 nova_compute[186329]:       <nova:flavor name="m1.nano" id="cb13e320-971c-46c2-a935-d695f3631bf8">
Dec 05 06:19:35 compute-0 nova_compute[186329]:         <nova:memory>128</nova:memory>
Dec 05 06:19:35 compute-0 nova_compute[186329]:         <nova:disk>1</nova:disk>
Dec 05 06:19:35 compute-0 nova_compute[186329]:         <nova:swap>0</nova:swap>
Dec 05 06:19:35 compute-0 nova_compute[186329]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 06:19:35 compute-0 nova_compute[186329]:         <nova:vcpus>1</nova:vcpus>
Dec 05 06:19:35 compute-0 nova_compute[186329]:         <nova:extraSpecs>
Dec 05 06:19:35 compute-0 nova_compute[186329]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 05 06:19:35 compute-0 nova_compute[186329]:         </nova:extraSpecs>
Dec 05 06:19:35 compute-0 nova_compute[186329]:       </nova:flavor>
Dec 05 06:19:35 compute-0 nova_compute[186329]:       <nova:image uuid="6903ca06-7f44-4ad2-ab8b-0d16feef7d51">
Dec 05 06:19:35 compute-0 nova_compute[186329]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 05 06:19:35 compute-0 nova_compute[186329]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 05 06:19:35 compute-0 nova_compute[186329]:         <nova:minDisk>1</nova:minDisk>
Dec 05 06:19:35 compute-0 nova_compute[186329]:         <nova:minRam>0</nova:minRam>
Dec 05 06:19:35 compute-0 nova_compute[186329]:         <nova:properties>
Dec 05 06:19:35 compute-0 nova_compute[186329]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 05 06:19:35 compute-0 nova_compute[186329]:         </nova:properties>
Dec 05 06:19:35 compute-0 nova_compute[186329]:       </nova:image>
Dec 05 06:19:35 compute-0 nova_compute[186329]:       <nova:owner>
Dec 05 06:19:35 compute-0 nova_compute[186329]:         <nova:user uuid="6e966152abb6429a8d2dc82faf5464b5">tempest-TestExecuteHostMaintenanceStrategy-1144698866-project-admin</nova:user>
Dec 05 06:19:35 compute-0 nova_compute[186329]:         <nova:project uuid="72c4ee5cc96a42b99210abaf8ae6fcc3">tempest-TestExecuteHostMaintenanceStrategy-1144698866</nova:project>
Dec 05 06:19:35 compute-0 nova_compute[186329]:       </nova:owner>
Dec 05 06:19:35 compute-0 nova_compute[186329]:       <nova:root type="image" uuid="6903ca06-7f44-4ad2-ab8b-0d16feef7d51"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:       <nova:ports>
Dec 05 06:19:35 compute-0 nova_compute[186329]:         <nova:port uuid="de6bc18e-8fd5-4a3e-ac7d-36278d4793df">
Dec 05 06:19:35 compute-0 nova_compute[186329]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:         </nova:port>
Dec 05 06:19:35 compute-0 nova_compute[186329]:       </nova:ports>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     </nova:instance>
Dec 05 06:19:35 compute-0 nova_compute[186329]:   </metadata>
Dec 05 06:19:35 compute-0 nova_compute[186329]:   <sysinfo type="smbios">
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <system>
Dec 05 06:19:35 compute-0 nova_compute[186329]:       <entry name="manufacturer">RDO</entry>
Dec 05 06:19:35 compute-0 nova_compute[186329]:       <entry name="product">OpenStack Compute</entry>
Dec 05 06:19:35 compute-0 nova_compute[186329]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 05 06:19:35 compute-0 nova_compute[186329]:       <entry name="serial">2a160ef8-a2f1-4959-a154-f46101bf8277</entry>
Dec 05 06:19:35 compute-0 nova_compute[186329]:       <entry name="uuid">2a160ef8-a2f1-4959-a154-f46101bf8277</entry>
Dec 05 06:19:35 compute-0 nova_compute[186329]:       <entry name="family">Virtual Machine</entry>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     </system>
Dec 05 06:19:35 compute-0 nova_compute[186329]:   </sysinfo>
Dec 05 06:19:35 compute-0 nova_compute[186329]:   <os>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <boot dev="hd"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <smbios mode="sysinfo"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:   </os>
Dec 05 06:19:35 compute-0 nova_compute[186329]:   <features>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <acpi/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <apic/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <vmcoreinfo/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:   </features>
Dec 05 06:19:35 compute-0 nova_compute[186329]:   <clock offset="utc">
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <timer name="hpet" present="no"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:   </clock>
Dec 05 06:19:35 compute-0 nova_compute[186329]:   <cpu mode="custom" match="exact">
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <model>Nehalem</model>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:   </cpu>
Dec 05 06:19:35 compute-0 nova_compute[186329]:   <devices>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <disk type="file" device="disk">
Dec 05 06:19:35 compute-0 nova_compute[186329]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:       <source file="/var/lib/nova/instances/2a160ef8-a2f1-4959-a154-f46101bf8277/disk"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:       <target dev="vda" bus="virtio"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     </disk>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <disk type="file" device="cdrom">
Dec 05 06:19:35 compute-0 nova_compute[186329]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:       <source file="/var/lib/nova/instances/2a160ef8-a2f1-4959-a154-f46101bf8277/disk.config"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:       <target dev="sda" bus="sata"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     </disk>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <interface type="ethernet">
Dec 05 06:19:35 compute-0 nova_compute[186329]:       <mac address="fa:16:3e:86:0b:00"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:       <model type="virtio"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:       <mtu size="1442"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:       <target dev="tapde6bc18e-8f"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     </interface>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <serial type="pty">
Dec 05 06:19:35 compute-0 nova_compute[186329]:       <log file="/var/lib/nova/instances/2a160ef8-a2f1-4959-a154-f46101bf8277/console.log" append="off"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     </serial>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <video>
Dec 05 06:19:35 compute-0 nova_compute[186329]:       <model type="virtio"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     </video>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <input type="tablet" bus="usb"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <rng model="virtio">
Dec 05 06:19:35 compute-0 nova_compute[186329]:       <backend model="random">/dev/urandom</backend>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     </rng>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <controller type="usb" index="0"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 05 06:19:35 compute-0 nova_compute[186329]:       <stats period="10"/>
Dec 05 06:19:35 compute-0 nova_compute[186329]:     </memballoon>
Dec 05 06:19:35 compute-0 nova_compute[186329]:   </devices>
Dec 05 06:19:35 compute-0 nova_compute[186329]: </domain>
Dec 05 06:19:35 compute-0 nova_compute[186329]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.850 186333 DEBUG nova.compute.manager [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Preparing to wait for external event network-vif-plugged-de6bc18e-8fd5-4a3e-ac7d-36278d4793df prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.851 186333 DEBUG oslo_concurrency.lockutils [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Acquiring lock "2a160ef8-a2f1-4959-a154-f46101bf8277-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.851 186333 DEBUG oslo_concurrency.lockutils [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "2a160ef8-a2f1-4959-a154-f46101bf8277-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.851 186333 DEBUG oslo_concurrency.lockutils [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "2a160ef8-a2f1-4959-a154-f46101bf8277-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.852 186333 DEBUG nova.virt.libvirt.vif [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-05T06:19:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-308361040',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-308361040',id=11,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='72c4ee5cc96a42b99210abaf8ae6fcc3',ramdisk_id='',reservation_id='r-cl0afaj0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1144698866',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1144698866-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T06:19:31Z,user_data=None,user_id='6e966152abb6429a8d2dc82faf5464b5',uuid=2a160ef8-a2f1-4959-a154-f46101bf8277,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "de6bc18e-8fd5-4a3e-ac7d-36278d4793df", "address": "fa:16:3e:86:0b:00", "network": {"id": "df5a1a6f-32f3-42ca-8c18-60ea4ce9d923", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1816665749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36f184a410c54894823168ed0f00b1ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde6bc18e-8f", "ovs_interfaceid": "de6bc18e-8fd5-4a3e-ac7d-36278d4793df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.852 186333 DEBUG nova.network.os_vif_util [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Converting VIF {"id": "de6bc18e-8fd5-4a3e-ac7d-36278d4793df", "address": "fa:16:3e:86:0b:00", "network": {"id": "df5a1a6f-32f3-42ca-8c18-60ea4ce9d923", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1816665749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36f184a410c54894823168ed0f00b1ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde6bc18e-8f", "ovs_interfaceid": "de6bc18e-8fd5-4a3e-ac7d-36278d4793df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.853 186333 DEBUG nova.network.os_vif_util [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:86:0b:00,bridge_name='br-int',has_traffic_filtering=True,id=de6bc18e-8fd5-4a3e-ac7d-36278d4793df,network=Network(df5a1a6f-32f3-42ca-8c18-60ea4ce9d923),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde6bc18e-8f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.853 186333 DEBUG os_vif [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:0b:00,bridge_name='br-int',has_traffic_filtering=True,id=de6bc18e-8fd5-4a3e-ac7d-36278d4793df,network=Network(df5a1a6f-32f3-42ca-8c18-60ea4ce9d923),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde6bc18e-8f') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.854 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.854 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.854 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.855 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.855 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '2990419e-2203-51c2-a8e7-0fe3771753e5', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.856 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.857 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.863 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.863 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapde6bc18e-8f, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.863 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapde6bc18e-8f, col_values=(('qos', UUID('2ab5cd15-222d-4269-a0ec-0e6972df3afa')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.864 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapde6bc18e-8f, col_values=(('external_ids', {'iface-id': 'de6bc18e-8fd5-4a3e-ac7d-36278d4793df', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:86:0b:00', 'vm-uuid': '2a160ef8-a2f1-4959-a154-f46101bf8277'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.864 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:19:35 compute-0 NetworkManager[55434]: <info>  [1764915575.8656] manager: (tapde6bc18e-8f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.866 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.869 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:19:35 compute-0 nova_compute[186329]: 2025-12-05 06:19:35.870 186333 INFO os_vif [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:0b:00,bridge_name='br-int',has_traffic_filtering=True,id=de6bc18e-8fd5-4a3e-ac7d-36278d4793df,network=Network(df5a1a6f-32f3-42ca-8c18-60ea4ce9d923),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde6bc18e-8f')
Dec 05 06:19:37 compute-0 nova_compute[186329]: 2025-12-05 06:19:37.406 186333 DEBUG nova.virt.libvirt.driver [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 05 06:19:37 compute-0 nova_compute[186329]: 2025-12-05 06:19:37.407 186333 DEBUG nova.virt.libvirt.driver [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 05 06:19:37 compute-0 nova_compute[186329]: 2025-12-05 06:19:37.407 186333 DEBUG nova.virt.libvirt.driver [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] No VIF found with MAC fa:16:3e:86:0b:00, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 05 06:19:37 compute-0 nova_compute[186329]: 2025-12-05 06:19:37.408 186333 INFO nova.virt.libvirt.driver [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Using config drive
Dec 05 06:19:37 compute-0 nova_compute[186329]: 2025-12-05 06:19:37.915 186333 WARNING neutronclient.v2_0.client [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:19:38 compute-0 nova_compute[186329]: 2025-12-05 06:19:38.595 186333 INFO nova.virt.libvirt.driver [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Creating config drive at /var/lib/nova/instances/2a160ef8-a2f1-4959-a154-f46101bf8277/disk.config
Dec 05 06:19:38 compute-0 nova_compute[186329]: 2025-12-05 06:19:38.599 186333 DEBUG oslo_concurrency.processutils [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2a160ef8-a2f1-4959-a154-f46101bf8277/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp58wdgowu execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:19:38 compute-0 nova_compute[186329]: 2025-12-05 06:19:38.709 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:19:38 compute-0 nova_compute[186329]: 2025-12-05 06:19:38.719 186333 DEBUG oslo_concurrency.processutils [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2a160ef8-a2f1-4959-a154-f46101bf8277/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp58wdgowu" returned: 0 in 0.120s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:19:38 compute-0 kernel: tapde6bc18e-8f: entered promiscuous mode
Dec 05 06:19:38 compute-0 NetworkManager[55434]: <info>  [1764915578.7675] manager: (tapde6bc18e-8f): new Tun device (/org/freedesktop/NetworkManager/Devices/45)
Dec 05 06:19:38 compute-0 ovn_controller[95223]: 2025-12-05T06:19:38Z|00093|binding|INFO|Claiming lport de6bc18e-8fd5-4a3e-ac7d-36278d4793df for this chassis.
Dec 05 06:19:38 compute-0 ovn_controller[95223]: 2025-12-05T06:19:38Z|00094|binding|INFO|de6bc18e-8fd5-4a3e-ac7d-36278d4793df: Claiming fa:16:3e:86:0b:00 10.100.0.14
Dec 05 06:19:38 compute-0 nova_compute[186329]: 2025-12-05 06:19:38.769 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:19:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:19:38.773 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:86:0b:00 10.100.0.14'], port_security=['fa:16:3e:86:0b:00 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '2a160ef8-a2f1-4959-a154-f46101bf8277', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '72c4ee5cc96a42b99210abaf8ae6fcc3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8deca48c-f664-4748-b23a-2c5f69c6b17a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1542876b-3afe-4c08-a982-954e6ce063b6, chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], logical_port=de6bc18e-8fd5-4a3e-ac7d-36278d4793df) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:19:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:19:38.775 104041 INFO neutron.agent.ovn.metadata.agent [-] Port de6bc18e-8fd5-4a3e-ac7d-36278d4793df in datapath df5a1a6f-32f3-42ca-8c18-60ea4ce9d923 bound to our chassis
Dec 05 06:19:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:19:38.776 104041 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network df5a1a6f-32f3-42ca-8c18-60ea4ce9d923
Dec 05 06:19:38 compute-0 ovn_controller[95223]: 2025-12-05T06:19:38Z|00095|binding|INFO|Setting lport de6bc18e-8fd5-4a3e-ac7d-36278d4793df ovn-installed in OVS
Dec 05 06:19:38 compute-0 ovn_controller[95223]: 2025-12-05T06:19:38Z|00096|binding|INFO|Setting lport de6bc18e-8fd5-4a3e-ac7d-36278d4793df up in Southbound
Dec 05 06:19:38 compute-0 nova_compute[186329]: 2025-12-05 06:19:38.786 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:19:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:19:38.788 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[bfd56198-46e7-4172-a7af-e6c778560692]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:19:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:19:38.789 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdf5a1a6f-31 in ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 05 06:19:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:19:38.791 206383 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdf5a1a6f-30 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 05 06:19:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:19:38.791 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[bf951c96-ce14-4f87-8e6a-2c3a406e982b]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:19:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:19:38.791 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[3177f698-e1f9-4f8f-9f8e-86b38276012c]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:19:38 compute-0 systemd-udevd[210079]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 06:19:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:19:38.799 104153 DEBUG oslo.privsep.daemon [-] privsep: reply[1373abd5-1616-4c92-b627-da90d94ad189]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:19:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:19:38.803 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[139e55fa-b67c-43cd-b864-b1b0bdad45ff]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:19:38 compute-0 systemd-machined[152967]: New machine qemu-7-instance-0000000b.
Dec 05 06:19:38 compute-0 NetworkManager[55434]: <info>  [1764915578.8132] device (tapde6bc18e-8f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 06:19:38 compute-0 NetworkManager[55434]: <info>  [1764915578.8138] device (tapde6bc18e-8f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 06:19:38 compute-0 systemd[1]: Started Virtual Machine qemu-7-instance-0000000b.
Dec 05 06:19:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:19:38.826 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[98d6982e-4f8c-4bb6-8b97-eb8e955066e5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:19:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:19:38.829 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[81105ce7-6dcb-41be-bdd4-ff898f29dc16]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:19:38 compute-0 NetworkManager[55434]: <info>  [1764915578.8301] manager: (tapdf5a1a6f-30): new Veth device (/org/freedesktop/NetworkManager/Devices/46)
Dec 05 06:19:38 compute-0 systemd-udevd[210085]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 06:19:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:19:38.858 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[cbc01875-662e-4917-bd67-71b44d2e41ec]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:19:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:19:38.861 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[179b35ae-bf94-42f0-be4a-546283c8986a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:19:38 compute-0 NetworkManager[55434]: <info>  [1764915578.8827] device (tapdf5a1a6f-30): carrier: link connected
Dec 05 06:19:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:19:38.890 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[7040aa33-3115-4197-b7d8-2823e5fded85]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:19:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:19:38.906 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[8320c265-8dd7-41ee-a87e-c7cd0616227d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdf5a1a6f-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:f8:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 345741, 'reachable_time': 34079, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210103, 'error': None, 'target': 'ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:19:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:19:38.921 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[6559e0e2-2b51-4191-a84e-4d3caa11e4d0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea7:f89d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 345741, 'tstamp': 345741}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210104, 'error': None, 'target': 'ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:19:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:19:38.934 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[1ae76083-05bd-4682-a605-09bb0d0426ab]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdf5a1a6f-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:f8:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 345741, 'reachable_time': 34079, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 210105, 'error': None, 'target': 'ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:19:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:19:38.956 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[1b4508c0-f40a-4bd3-bbe5-397f9f1e34b2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:19:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:19:38.999 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[1507c4dc-af70-4843-bd1e-5868f74c52ae]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:19:39.000 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdf5a1a6f-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:19:39.000 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:19:39.000 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdf5a1a6f-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:19:39 compute-0 NetworkManager[55434]: <info>  [1764915579.0029] manager: (tapdf5a1a6f-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Dec 05 06:19:39 compute-0 nova_compute[186329]: 2025-12-05 06:19:39.002 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:19:39 compute-0 kernel: tapdf5a1a6f-30: entered promiscuous mode
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:19:39.007 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdf5a1a6f-30, col_values=(('external_ids', {'iface-id': '11b2e7a6-c4ec-4f31-8535-807d9ce71179'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:19:39 compute-0 ovn_controller[95223]: 2025-12-05T06:19:39Z|00097|binding|INFO|Releasing lport 11b2e7a6-c4ec-4f31-8535-807d9ce71179 from this chassis (sb_readonly=0)
Dec 05 06:19:39 compute-0 nova_compute[186329]: 2025-12-05 06:19:39.008 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:19:39.019 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[6e5ad6b9-1599-4e4b-a5ba-750f7bab53ce]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:19:39 compute-0 nova_compute[186329]: 2025-12-05 06:19:39.020 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:19:39.020 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/df5a1a6f-32f3-42ca-8c18-60ea4ce9d923.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/df5a1a6f-32f3-42ca-8c18-60ea4ce9d923.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:19:39.020 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/df5a1a6f-32f3-42ca-8c18-60ea4ce9d923.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/df5a1a6f-32f3-42ca-8c18-60ea4ce9d923.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:19:39.020 104041 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for df5a1a6f-32f3-42ca-8c18-60ea4ce9d923 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:19:39.020 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/df5a1a6f-32f3-42ca-8c18-60ea4ce9d923.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/df5a1a6f-32f3-42ca-8c18-60ea4ce9d923.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:19:39.021 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[2ca19c3b-625f-44c2-a144-c68fdbb5b01c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:19:39.021 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/df5a1a6f-32f3-42ca-8c18-60ea4ce9d923.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/df5a1a6f-32f3-42ca-8c18-60ea4ce9d923.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:19:39.022 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[c983fe3c-7596-4f31-9bad-6aa8ca69f0b7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:19:39.022 104041 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]: global
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]:     log         /dev/log local0 debug
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]:     log-tag     haproxy-metadata-proxy-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]:     user        root
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]:     group       root
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]:     maxconn     1024
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]:     pidfile     /var/lib/neutron/external/pids/df5a1a6f-32f3-42ca-8c18-60ea4ce9d923.pid.haproxy
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]:     daemon
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]: defaults
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]:     log global
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]:     mode http
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]:     option httplog
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]:     option dontlognull
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]:     option http-server-close
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]:     option forwardfor
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]:     retries                 3
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]:     timeout http-request    30s
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]:     timeout connect         30s
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]:     timeout client          32s
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]:     timeout server          32s
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]:     timeout http-keep-alive 30s
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]: listen listener
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]:     bind 169.254.169.254:80
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]:     
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]:     http-request add-header X-OVN-Network-ID df5a1a6f-32f3-42ca-8c18-60ea4ce9d923
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 05 06:19:39 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:19:39.022 104041 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923', 'env', 'PROCESS_TAG=haproxy-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/df5a1a6f-32f3-42ca-8c18-60ea4ce9d923.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 05 06:19:39 compute-0 podman[210140]: 2025-12-05 06:19:39.33910643 +0000 UTC m=+0.032544754 container create 36bb798422e0faccd57ea8e2e473cc1122192bfd9eb4e25d6c1576a6dc84c28b (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 05 06:19:39 compute-0 systemd[1]: Started libpod-conmon-36bb798422e0faccd57ea8e2e473cc1122192bfd9eb4e25d6c1576a6dc84c28b.scope.
Dec 05 06:19:39 compute-0 systemd[1]: Started libcrun container.
Dec 05 06:19:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/707212482bd7660b4a0dffe9231f71cc7f60d767844b8a522d5388605f21c193/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 06:19:39 compute-0 podman[210140]: 2025-12-05 06:19:39.401816287 +0000 UTC m=+0.095254610 container init 36bb798422e0faccd57ea8e2e473cc1122192bfd9eb4e25d6c1576a6dc84c28b (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4)
Dec 05 06:19:39 compute-0 podman[210140]: 2025-12-05 06:19:39.406909427 +0000 UTC m=+0.100347751 container start 36bb798422e0faccd57ea8e2e473cc1122192bfd9eb4e25d6c1576a6dc84c28b (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 05 06:19:39 compute-0 podman[210140]: 2025-12-05 06:19:39.323681666 +0000 UTC m=+0.017120010 image pull 773fbabd60bc1c8e2ad203301a187a593937fa42bcc751aba0971bd96baac0cb quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current
Dec 05 06:19:39 compute-0 neutron-haproxy-ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923[210152]: [NOTICE]   (210156) : New worker (210158) forked
Dec 05 06:19:39 compute-0 neutron-haproxy-ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923[210152]: [NOTICE]   (210156) : Loading success.
Dec 05 06:19:39 compute-0 nova_compute[186329]: 2025-12-05 06:19:39.599 186333 DEBUG nova.compute.manager [req-d91d109a-426e-4686-92b2-645fa85d6178 req-aee234bb-8a04-4780-a1ea-e80d46aaf87f fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Received event network-vif-plugged-de6bc18e-8fd5-4a3e-ac7d-36278d4793df external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:19:39 compute-0 nova_compute[186329]: 2025-12-05 06:19:39.601 186333 DEBUG oslo_concurrency.lockutils [req-d91d109a-426e-4686-92b2-645fa85d6178 req-aee234bb-8a04-4780-a1ea-e80d46aaf87f fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "2a160ef8-a2f1-4959-a154-f46101bf8277-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:19:39 compute-0 nova_compute[186329]: 2025-12-05 06:19:39.601 186333 DEBUG oslo_concurrency.lockutils [req-d91d109a-426e-4686-92b2-645fa85d6178 req-aee234bb-8a04-4780-a1ea-e80d46aaf87f fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "2a160ef8-a2f1-4959-a154-f46101bf8277-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:19:39 compute-0 nova_compute[186329]: 2025-12-05 06:19:39.602 186333 DEBUG oslo_concurrency.lockutils [req-d91d109a-426e-4686-92b2-645fa85d6178 req-aee234bb-8a04-4780-a1ea-e80d46aaf87f fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "2a160ef8-a2f1-4959-a154-f46101bf8277-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:19:39 compute-0 nova_compute[186329]: 2025-12-05 06:19:39.602 186333 DEBUG nova.compute.manager [req-d91d109a-426e-4686-92b2-645fa85d6178 req-aee234bb-8a04-4780-a1ea-e80d46aaf87f fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Processing event network-vif-plugged-de6bc18e-8fd5-4a3e-ac7d-36278d4793df _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 05 06:19:39 compute-0 nova_compute[186329]: 2025-12-05 06:19:39.603 186333 DEBUG nova.compute.manager [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 05 06:19:39 compute-0 nova_compute[186329]: 2025-12-05 06:19:39.606 186333 DEBUG nova.virt.libvirt.driver [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Dec 05 06:19:39 compute-0 nova_compute[186329]: 2025-12-05 06:19:39.609 186333 INFO nova.virt.libvirt.driver [-] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Instance spawned successfully.
Dec 05 06:19:39 compute-0 nova_compute[186329]: 2025-12-05 06:19:39.609 186333 DEBUG nova.virt.libvirt.driver [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Dec 05 06:19:40 compute-0 nova_compute[186329]: 2025-12-05 06:19:40.119 186333 DEBUG nova.virt.libvirt.driver [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:19:40 compute-0 nova_compute[186329]: 2025-12-05 06:19:40.119 186333 DEBUG nova.virt.libvirt.driver [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:19:40 compute-0 nova_compute[186329]: 2025-12-05 06:19:40.119 186333 DEBUG nova.virt.libvirt.driver [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:19:40 compute-0 nova_compute[186329]: 2025-12-05 06:19:40.120 186333 DEBUG nova.virt.libvirt.driver [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:19:40 compute-0 nova_compute[186329]: 2025-12-05 06:19:40.120 186333 DEBUG nova.virt.libvirt.driver [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:19:40 compute-0 nova_compute[186329]: 2025-12-05 06:19:40.120 186333 DEBUG nova.virt.libvirt.driver [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:19:40 compute-0 nova_compute[186329]: 2025-12-05 06:19:40.628 186333 INFO nova.compute.manager [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Took 8.61 seconds to spawn the instance on the hypervisor.
Dec 05 06:19:40 compute-0 nova_compute[186329]: 2025-12-05 06:19:40.628 186333 DEBUG nova.compute.manager [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 05 06:19:40 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:19:40.730 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=89d40815-76f5-4f1d-9077-84d831b7d6c4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:19:40 compute-0 nova_compute[186329]: 2025-12-05 06:19:40.866 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:19:41 compute-0 nova_compute[186329]: 2025-12-05 06:19:41.153 186333 INFO nova.compute.manager [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Took 13.77 seconds to build instance.
Dec 05 06:19:41 compute-0 nova_compute[186329]: 2025-12-05 06:19:41.636 186333 DEBUG nova.compute.manager [req-ede19287-9c12-410f-b043-0c002933db36 req-54b92819-7ec5-4766-9ba1-f1725d39e231 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Received event network-vif-plugged-de6bc18e-8fd5-4a3e-ac7d-36278d4793df external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:19:41 compute-0 nova_compute[186329]: 2025-12-05 06:19:41.636 186333 DEBUG oslo_concurrency.lockutils [req-ede19287-9c12-410f-b043-0c002933db36 req-54b92819-7ec5-4766-9ba1-f1725d39e231 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "2a160ef8-a2f1-4959-a154-f46101bf8277-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:19:41 compute-0 nova_compute[186329]: 2025-12-05 06:19:41.636 186333 DEBUG oslo_concurrency.lockutils [req-ede19287-9c12-410f-b043-0c002933db36 req-54b92819-7ec5-4766-9ba1-f1725d39e231 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "2a160ef8-a2f1-4959-a154-f46101bf8277-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:19:41 compute-0 nova_compute[186329]: 2025-12-05 06:19:41.636 186333 DEBUG oslo_concurrency.lockutils [req-ede19287-9c12-410f-b043-0c002933db36 req-54b92819-7ec5-4766-9ba1-f1725d39e231 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "2a160ef8-a2f1-4959-a154-f46101bf8277-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:19:41 compute-0 nova_compute[186329]: 2025-12-05 06:19:41.636 186333 DEBUG nova.compute.manager [req-ede19287-9c12-410f-b043-0c002933db36 req-54b92819-7ec5-4766-9ba1-f1725d39e231 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] No waiting events found dispatching network-vif-plugged-de6bc18e-8fd5-4a3e-ac7d-36278d4793df pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:19:41 compute-0 nova_compute[186329]: 2025-12-05 06:19:41.637 186333 WARNING nova.compute.manager [req-ede19287-9c12-410f-b043-0c002933db36 req-54b92819-7ec5-4766-9ba1-f1725d39e231 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Received unexpected event network-vif-plugged-de6bc18e-8fd5-4a3e-ac7d-36278d4793df for instance with vm_state active and task_state None.
Dec 05 06:19:41 compute-0 nova_compute[186329]: 2025-12-05 06:19:41.658 186333 DEBUG oslo_concurrency.lockutils [None req-ba8ca39b-caa2-49a5-a4b8-d1e9d6365fe9 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "2a160ef8-a2f1-4959-a154-f46101bf8277" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.282s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:19:43 compute-0 nova_compute[186329]: 2025-12-05 06:19:43.711 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:19:45 compute-0 nova_compute[186329]: 2025-12-05 06:19:45.868 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:19:47 compute-0 nova_compute[186329]: 2025-12-05 06:19:47.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:19:48 compute-0 nova_compute[186329]: 2025-12-05 06:19:48.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:19:48 compute-0 nova_compute[186329]: 2025-12-05 06:19:48.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:19:48 compute-0 nova_compute[186329]: 2025-12-05 06:19:48.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:19:48 compute-0 nova_compute[186329]: 2025-12-05 06:19:48.710 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 05 06:19:48 compute-0 nova_compute[186329]: 2025-12-05 06:19:48.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:19:48 compute-0 nova_compute[186329]: 2025-12-05 06:19:48.717 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:19:49 compute-0 nova_compute[186329]: 2025-12-05 06:19:49.225 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:19:49 compute-0 nova_compute[186329]: 2025-12-05 06:19:49.225 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:19:49 compute-0 nova_compute[186329]: 2025-12-05 06:19:49.225 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:19:49 compute-0 nova_compute[186329]: 2025-12-05 06:19:49.225 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 05 06:19:49 compute-0 podman[210165]: 2025-12-05 06:19:49.291942176 +0000 UTC m=+0.037871104 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 06:19:49 compute-0 podman[210163]: 2025-12-05 06:19:49.316739362 +0000 UTC m=+0.063461540 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true)
Dec 05 06:19:50 compute-0 nova_compute[186329]: 2025-12-05 06:19:50.253 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2a160ef8-a2f1-4959-a154-f46101bf8277/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:19:50 compute-0 nova_compute[186329]: 2025-12-05 06:19:50.307 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2a160ef8-a2f1-4959-a154-f46101bf8277/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:19:50 compute-0 nova_compute[186329]: 2025-12-05 06:19:50.308 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2a160ef8-a2f1-4959-a154-f46101bf8277/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:19:50 compute-0 nova_compute[186329]: 2025-12-05 06:19:50.360 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2a160ef8-a2f1-4959-a154-f46101bf8277/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:19:50 compute-0 nova_compute[186329]: 2025-12-05 06:19:50.547 186333 WARNING nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:19:50 compute-0 nova_compute[186329]: 2025-12-05 06:19:50.548 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:19:50 compute-0 nova_compute[186329]: 2025-12-05 06:19:50.562 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.014s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:19:50 compute-0 nova_compute[186329]: 2025-12-05 06:19:50.563 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5725MB free_disk=73.1682357788086GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 05 06:19:50 compute-0 nova_compute[186329]: 2025-12-05 06:19:50.563 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:19:50 compute-0 nova_compute[186329]: 2025-12-05 06:19:50.563 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:19:50 compute-0 nova_compute[186329]: 2025-12-05 06:19:50.871 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:19:51 compute-0 ovn_controller[95223]: 2025-12-05T06:19:51Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:86:0b:00 10.100.0.14
Dec 05 06:19:51 compute-0 ovn_controller[95223]: 2025-12-05T06:19:51Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:86:0b:00 10.100.0.14
Dec 05 06:19:52 compute-0 nova_compute[186329]: 2025-12-05 06:19:52.109 186333 INFO nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance 83dcc9e9-9fcf-4e34-8b76-598c38ed9bc0 has allocations against this compute host but is not found in the database.
Dec 05 06:19:52 compute-0 nova_compute[186329]: 2025-12-05 06:19:52.110 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance 2a160ef8-a2f1-4959-a154-f46101bf8277 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 05 06:19:52 compute-0 nova_compute[186329]: 2025-12-05 06:19:52.110 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 05 06:19:52 compute-0 nova_compute[186329]: 2025-12-05 06:19:52.110 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 06:19:50 up 57 min,  0 user,  load average: 0.29, 0.24, 0.31\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_72c4ee5cc96a42b99210abaf8ae6fcc3': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 05 06:19:52 compute-0 nova_compute[186329]: 2025-12-05 06:19:52.159 186333 DEBUG nova.compute.provider_tree [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:19:52 compute-0 nova_compute[186329]: 2025-12-05 06:19:52.663 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:19:53 compute-0 nova_compute[186329]: 2025-12-05 06:19:53.169 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 05 06:19:53 compute-0 nova_compute[186329]: 2025-12-05 06:19:53.170 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.606s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:19:53 compute-0 nova_compute[186329]: 2025-12-05 06:19:53.170 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:19:53 compute-0 nova_compute[186329]: 2025-12-05 06:19:53.170 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Dec 05 06:19:53 compute-0 nova_compute[186329]: 2025-12-05 06:19:53.674 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:19:53 compute-0 nova_compute[186329]: 2025-12-05 06:19:53.718 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:19:53 compute-0 nova_compute[186329]: 2025-12-05 06:19:53.879 186333 DEBUG nova.virt.libvirt.driver [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2cc50d23-400e-4537-aba3-e0b30f79963a] Creating tmpfile /var/lib/nova/instances/tmpfe9zt88u to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Dec 05 06:19:53 compute-0 nova_compute[186329]: 2025-12-05 06:19:53.880 186333 WARNING neutronclient.v2_0.client [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:19:53 compute-0 nova_compute[186329]: 2025-12-05 06:19:53.882 186333 DEBUG nova.compute.manager [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfe9zt88u',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Dec 05 06:19:55 compute-0 nova_compute[186329]: 2025-12-05 06:19:55.872 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:19:55 compute-0 nova_compute[186329]: 2025-12-05 06:19:55.903 186333 WARNING neutronclient.v2_0.client [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:19:56 compute-0 nova_compute[186329]: 2025-12-05 06:19:56.717 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:19:56 compute-0 nova_compute[186329]: 2025-12-05 06:19:56.718 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:19:56 compute-0 nova_compute[186329]: 2025-12-05 06:19:56.718 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:19:56 compute-0 nova_compute[186329]: 2025-12-05 06:19:56.718 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:19:56 compute-0 nova_compute[186329]: 2025-12-05 06:19:56.718 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Dec 05 06:19:57 compute-0 nova_compute[186329]: 2025-12-05 06:19:57.223 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Dec 05 06:19:58 compute-0 podman[210230]: 2025-12-05 06:19:58.462427099 +0000 UTC m=+0.044628891 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 05 06:19:58 compute-0 podman[210229]: 2025-12-05 06:19:58.463694711 +0000 UTC m=+0.045948893 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=edpm, architecture=x86_64, name=ubi9-minimal, vcs-type=git)
Dec 05 06:19:58 compute-0 podman[210228]: 2025-12-05 06:19:58.484634749 +0000 UTC m=+0.068434955 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 05 06:19:58 compute-0 nova_compute[186329]: 2025-12-05 06:19:58.720 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:19:59 compute-0 podman[196599]: time="2025-12-05T06:19:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:19:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:19:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18590 "" "Go-http-client/1.1"
Dec 05 06:19:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:19:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3045 "" "Go-http-client/1.1"
Dec 05 06:20:00 compute-0 nova_compute[186329]: 2025-12-05 06:20:00.873 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:01 compute-0 nova_compute[186329]: 2025-12-05 06:20:01.206 186333 DEBUG nova.compute.manager [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfe9zt88u',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2cc50d23-400e-4537-aba3-e0b30f79963a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Dec 05 06:20:01 compute-0 openstack_network_exporter[198686]: ERROR   06:20:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:20:01 compute-0 openstack_network_exporter[198686]: ERROR   06:20:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:20:01 compute-0 openstack_network_exporter[198686]: ERROR   06:20:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:20:01 compute-0 openstack_network_exporter[198686]: ERROR   06:20:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:20:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:20:01 compute-0 openstack_network_exporter[198686]: ERROR   06:20:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:20:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:20:02 compute-0 nova_compute[186329]: 2025-12-05 06:20:02.217 186333 DEBUG oslo_concurrency.lockutils [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "refresh_cache-2cc50d23-400e-4537-aba3-e0b30f79963a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:20:02 compute-0 nova_compute[186329]: 2025-12-05 06:20:02.217 186333 DEBUG oslo_concurrency.lockutils [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquired lock "refresh_cache-2cc50d23-400e-4537-aba3-e0b30f79963a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:20:02 compute-0 nova_compute[186329]: 2025-12-05 06:20:02.217 186333 DEBUG nova.network.neutron [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2cc50d23-400e-4537-aba3-e0b30f79963a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 05 06:20:02 compute-0 nova_compute[186329]: 2025-12-05 06:20:02.722 186333 WARNING neutronclient.v2_0.client [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:20:03 compute-0 nova_compute[186329]: 2025-12-05 06:20:03.722 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:03 compute-0 nova_compute[186329]: 2025-12-05 06:20:03.759 186333 WARNING neutronclient.v2_0.client [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:20:04 compute-0 nova_compute[186329]: 2025-12-05 06:20:04.568 186333 DEBUG nova.network.neutron [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2cc50d23-400e-4537-aba3-e0b30f79963a] Updating instance_info_cache with network_info: [{"id": "2957b826-c9c7-44e6-bf3d-a6c1dc5cb3cd", "address": "fa:16:3e:1f:53:de", "network": {"id": "df5a1a6f-32f3-42ca-8c18-60ea4ce9d923", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1816665749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36f184a410c54894823168ed0f00b1ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2957b826-c9", "ovs_interfaceid": "2957b826-c9c7-44e6-bf3d-a6c1dc5cb3cd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:20:05 compute-0 nova_compute[186329]: 2025-12-05 06:20:05.073 186333 DEBUG oslo_concurrency.lockutils [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Releasing lock "refresh_cache-2cc50d23-400e-4537-aba3-e0b30f79963a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:20:05 compute-0 nova_compute[186329]: 2025-12-05 06:20:05.082 186333 DEBUG nova.virt.libvirt.driver [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2cc50d23-400e-4537-aba3-e0b30f79963a] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfe9zt88u',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2cc50d23-400e-4537-aba3-e0b30f79963a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Dec 05 06:20:05 compute-0 nova_compute[186329]: 2025-12-05 06:20:05.082 186333 DEBUG nova.virt.libvirt.driver [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2cc50d23-400e-4537-aba3-e0b30f79963a] Creating instance directory: /var/lib/nova/instances/2cc50d23-400e-4537-aba3-e0b30f79963a pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Dec 05 06:20:05 compute-0 nova_compute[186329]: 2025-12-05 06:20:05.082 186333 DEBUG nova.virt.libvirt.driver [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2cc50d23-400e-4537-aba3-e0b30f79963a] Creating disk.info with the contents: {'/var/lib/nova/instances/2cc50d23-400e-4537-aba3-e0b30f79963a/disk': 'qcow2', '/var/lib/nova/instances/2cc50d23-400e-4537-aba3-e0b30f79963a/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Dec 05 06:20:05 compute-0 nova_compute[186329]: 2025-12-05 06:20:05.083 186333 DEBUG nova.virt.libvirt.driver [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2cc50d23-400e-4537-aba3-e0b30f79963a] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Dec 05 06:20:05 compute-0 nova_compute[186329]: 2025-12-05 06:20:05.083 186333 DEBUG nova.objects.instance [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 2cc50d23-400e-4537-aba3-e0b30f79963a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:20:05 compute-0 nova_compute[186329]: 2025-12-05 06:20:05.586 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:20:05 compute-0 nova_compute[186329]: 2025-12-05 06:20:05.589 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:20:05 compute-0 nova_compute[186329]: 2025-12-05 06:20:05.590 186333 DEBUG oslo_concurrency.processutils [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:20:05 compute-0 nova_compute[186329]: 2025-12-05 06:20:05.631 186333 DEBUG oslo_concurrency.processutils [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:20:05 compute-0 nova_compute[186329]: 2025-12-05 06:20:05.632 186333 DEBUG oslo_concurrency.lockutils [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:20:05 compute-0 nova_compute[186329]: 2025-12-05 06:20:05.632 186333 DEBUG oslo_concurrency.lockutils [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:20:05 compute-0 nova_compute[186329]: 2025-12-05 06:20:05.633 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:20:05 compute-0 nova_compute[186329]: 2025-12-05 06:20:05.635 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:20:05 compute-0 nova_compute[186329]: 2025-12-05 06:20:05.635 186333 DEBUG oslo_concurrency.processutils [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:20:05 compute-0 nova_compute[186329]: 2025-12-05 06:20:05.676 186333 DEBUG oslo_concurrency.processutils [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:20:05 compute-0 nova_compute[186329]: 2025-12-05 06:20:05.677 186333 DEBUG oslo_concurrency.processutils [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b,backing_fmt=raw /var/lib/nova/instances/2cc50d23-400e-4537-aba3-e0b30f79963a/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:20:05 compute-0 nova_compute[186329]: 2025-12-05 06:20:05.696 186333 DEBUG oslo_concurrency.processutils [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b,backing_fmt=raw /var/lib/nova/instances/2cc50d23-400e-4537-aba3-e0b30f79963a/disk 1073741824" returned: 0 in 0.019s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:20:05 compute-0 nova_compute[186329]: 2025-12-05 06:20:05.697 186333 DEBUG oslo_concurrency.lockutils [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.064s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:20:05 compute-0 nova_compute[186329]: 2025-12-05 06:20:05.697 186333 DEBUG oslo_concurrency.processutils [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:20:05 compute-0 nova_compute[186329]: 2025-12-05 06:20:05.738 186333 DEBUG oslo_concurrency.processutils [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:20:05 compute-0 nova_compute[186329]: 2025-12-05 06:20:05.739 186333 DEBUG nova.virt.disk.api [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Checking if we can resize image /var/lib/nova/instances/2cc50d23-400e-4537-aba3-e0b30f79963a/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 05 06:20:05 compute-0 nova_compute[186329]: 2025-12-05 06:20:05.739 186333 DEBUG oslo_concurrency.processutils [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2cc50d23-400e-4537-aba3-e0b30f79963a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:20:05 compute-0 nova_compute[186329]: 2025-12-05 06:20:05.780 186333 DEBUG oslo_concurrency.processutils [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2cc50d23-400e-4537-aba3-e0b30f79963a/disk --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:20:05 compute-0 nova_compute[186329]: 2025-12-05 06:20:05.781 186333 DEBUG nova.virt.disk.api [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Cannot resize image /var/lib/nova/instances/2cc50d23-400e-4537-aba3-e0b30f79963a/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 05 06:20:05 compute-0 nova_compute[186329]: 2025-12-05 06:20:05.781 186333 DEBUG nova.objects.instance [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lazy-loading 'migration_context' on Instance uuid 2cc50d23-400e-4537-aba3-e0b30f79963a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:20:05 compute-0 nova_compute[186329]: 2025-12-05 06:20:05.876 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:06 compute-0 nova_compute[186329]: 2025-12-05 06:20:06.285 186333 DEBUG nova.objects.base [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Object Instance<2cc50d23-400e-4537-aba3-e0b30f79963a> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Dec 05 06:20:06 compute-0 nova_compute[186329]: 2025-12-05 06:20:06.286 186333 DEBUG oslo_concurrency.processutils [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/2cc50d23-400e-4537-aba3-e0b30f79963a/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:20:06 compute-0 nova_compute[186329]: 2025-12-05 06:20:06.303 186333 DEBUG oslo_concurrency.processutils [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/2cc50d23-400e-4537-aba3-e0b30f79963a/disk.config 497664" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:20:06 compute-0 nova_compute[186329]: 2025-12-05 06:20:06.303 186333 DEBUG nova.virt.libvirt.driver [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2cc50d23-400e-4537-aba3-e0b30f79963a] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Dec 05 06:20:06 compute-0 nova_compute[186329]: 2025-12-05 06:20:06.305 186333 DEBUG nova.virt.libvirt.vif [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-05T06:19:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-215099084',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-215099084',id=10,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T06:19:20Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='72c4ee5cc96a42b99210abaf8ae6fcc3',ramdisk_id='',reservation_id='r-98t7n2q0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1144698866',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1144698866-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T06:19:20Z,user_data=None,user_id='6e966152abb6429a8d2dc82faf5464b5',uuid=2cc50d23-400e-4537-aba3-e0b30f79963a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2957b826-c9c7-44e6-bf3d-a6c1dc5cb3cd", "address": "fa:16:3e:1f:53:de", "network": {"id": "df5a1a6f-32f3-42ca-8c18-60ea4ce9d923", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1816665749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36f184a410c54894823168ed0f00b1ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap2957b826-c9", "ovs_interfaceid": "2957b826-c9c7-44e6-bf3d-a6c1dc5cb3cd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 05 06:20:06 compute-0 nova_compute[186329]: 2025-12-05 06:20:06.305 186333 DEBUG nova.network.os_vif_util [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Converting VIF {"id": "2957b826-c9c7-44e6-bf3d-a6c1dc5cb3cd", "address": "fa:16:3e:1f:53:de", "network": {"id": "df5a1a6f-32f3-42ca-8c18-60ea4ce9d923", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1816665749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36f184a410c54894823168ed0f00b1ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap2957b826-c9", "ovs_interfaceid": "2957b826-c9c7-44e6-bf3d-a6c1dc5cb3cd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:20:06 compute-0 nova_compute[186329]: 2025-12-05 06:20:06.306 186333 DEBUG nova.network.os_vif_util [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1f:53:de,bridge_name='br-int',has_traffic_filtering=True,id=2957b826-c9c7-44e6-bf3d-a6c1dc5cb3cd,network=Network(df5a1a6f-32f3-42ca-8c18-60ea4ce9d923),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2957b826-c9') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:20:06 compute-0 nova_compute[186329]: 2025-12-05 06:20:06.306 186333 DEBUG os_vif [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:53:de,bridge_name='br-int',has_traffic_filtering=True,id=2957b826-c9c7-44e6-bf3d-a6c1dc5cb3cd,network=Network(df5a1a6f-32f3-42ca-8c18-60ea4ce9d923),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2957b826-c9') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 05 06:20:06 compute-0 nova_compute[186329]: 2025-12-05 06:20:06.307 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:06 compute-0 nova_compute[186329]: 2025-12-05 06:20:06.307 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:20:06 compute-0 nova_compute[186329]: 2025-12-05 06:20:06.308 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:20:06 compute-0 nova_compute[186329]: 2025-12-05 06:20:06.308 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:06 compute-0 nova_compute[186329]: 2025-12-05 06:20:06.309 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '5d199669-a5aa-5c45-8192-df5e314867fc', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:20:06 compute-0 nova_compute[186329]: 2025-12-05 06:20:06.312 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:20:06 compute-0 nova_compute[186329]: 2025-12-05 06:20:06.314 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:06 compute-0 nova_compute[186329]: 2025-12-05 06:20:06.314 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2957b826-c9, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:20:06 compute-0 nova_compute[186329]: 2025-12-05 06:20:06.314 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap2957b826-c9, col_values=(('qos', UUID('2ee87e5a-d528-4b82-a6d0-3427744a473c')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:20:06 compute-0 nova_compute[186329]: 2025-12-05 06:20:06.315 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap2957b826-c9, col_values=(('external_ids', {'iface-id': '2957b826-c9c7-44e6-bf3d-a6c1dc5cb3cd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1f:53:de', 'vm-uuid': '2cc50d23-400e-4537-aba3-e0b30f79963a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:20:06 compute-0 NetworkManager[55434]: <info>  [1764915606.3164] manager: (tap2957b826-c9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Dec 05 06:20:06 compute-0 nova_compute[186329]: 2025-12-05 06:20:06.315 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:06 compute-0 nova_compute[186329]: 2025-12-05 06:20:06.318 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:20:06 compute-0 nova_compute[186329]: 2025-12-05 06:20:06.320 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:06 compute-0 nova_compute[186329]: 2025-12-05 06:20:06.320 186333 INFO os_vif [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:53:de,bridge_name='br-int',has_traffic_filtering=True,id=2957b826-c9c7-44e6-bf3d-a6c1dc5cb3cd,network=Network(df5a1a6f-32f3-42ca-8c18-60ea4ce9d923),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2957b826-c9')
Dec 05 06:20:06 compute-0 nova_compute[186329]: 2025-12-05 06:20:06.321 186333 DEBUG nova.virt.libvirt.driver [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Dec 05 06:20:06 compute-0 nova_compute[186329]: 2025-12-05 06:20:06.321 186333 DEBUG nova.compute.manager [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfe9zt88u',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2cc50d23-400e-4537-aba3-e0b30f79963a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Dec 05 06:20:06 compute-0 nova_compute[186329]: 2025-12-05 06:20:06.322 186333 WARNING neutronclient.v2_0.client [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:20:06 compute-0 nova_compute[186329]: 2025-12-05 06:20:06.505 186333 WARNING neutronclient.v2_0.client [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:20:07 compute-0 nova_compute[186329]: 2025-12-05 06:20:07.003 186333 DEBUG nova.network.neutron [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2cc50d23-400e-4537-aba3-e0b30f79963a] Port 2957b826-c9c7-44e6-bf3d-a6c1dc5cb3cd updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Dec 05 06:20:07 compute-0 nova_compute[186329]: 2025-12-05 06:20:07.010 186333 DEBUG nova.compute.manager [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfe9zt88u',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2cc50d23-400e-4537-aba3-e0b30f79963a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Dec 05 06:20:08 compute-0 nova_compute[186329]: 2025-12-05 06:20:08.724 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:08 compute-0 ovn_controller[95223]: 2025-12-05T06:20:08Z|00098|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Dec 05 06:20:10 compute-0 kernel: tap2957b826-c9: entered promiscuous mode
Dec 05 06:20:10 compute-0 NetworkManager[55434]: <info>  [1764915610.4151] manager: (tap2957b826-c9): new Tun device (/org/freedesktop/NetworkManager/Devices/49)
Dec 05 06:20:10 compute-0 nova_compute[186329]: 2025-12-05 06:20:10.417 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:10 compute-0 ovn_controller[95223]: 2025-12-05T06:20:10Z|00099|binding|INFO|Claiming lport 2957b826-c9c7-44e6-bf3d-a6c1dc5cb3cd for this additional chassis.
Dec 05 06:20:10 compute-0 ovn_controller[95223]: 2025-12-05T06:20:10Z|00100|binding|INFO|2957b826-c9c7-44e6-bf3d-a6c1dc5cb3cd: Claiming fa:16:3e:1f:53:de 10.100.0.4
Dec 05 06:20:10 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:10.422 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:53:de 10.100.0.4'], port_security=['fa:16:3e:1f:53:de 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '2cc50d23-400e-4537-aba3-e0b30f79963a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '72c4ee5cc96a42b99210abaf8ae6fcc3', 'neutron:revision_number': '10', 'neutron:security_group_ids': '8deca48c-f664-4748-b23a-2c5f69c6b17a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1542876b-3afe-4c08-a982-954e6ce063b6, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=2957b826-c9c7-44e6-bf3d-a6c1dc5cb3cd) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:20:10 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:10.425 104041 INFO neutron.agent.ovn.metadata.agent [-] Port 2957b826-c9c7-44e6-bf3d-a6c1dc5cb3cd in datapath df5a1a6f-32f3-42ca-8c18-60ea4ce9d923 unbound from our chassis
Dec 05 06:20:10 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:10.425 104041 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network df5a1a6f-32f3-42ca-8c18-60ea4ce9d923
Dec 05 06:20:10 compute-0 ovn_controller[95223]: 2025-12-05T06:20:10Z|00101|binding|INFO|Setting lport 2957b826-c9c7-44e6-bf3d-a6c1dc5cb3cd ovn-installed in OVS
Dec 05 06:20:10 compute-0 nova_compute[186329]: 2025-12-05 06:20:10.432 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:10 compute-0 nova_compute[186329]: 2025-12-05 06:20:10.433 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:10 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:10.438 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[264fd848-5636-4201-bb13-421e55b54613]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:20:10 compute-0 systemd-udevd[210318]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 06:20:10 compute-0 systemd-machined[152967]: New machine qemu-8-instance-0000000a.
Dec 05 06:20:10 compute-0 systemd[1]: Started Virtual Machine qemu-8-instance-0000000a.
Dec 05 06:20:10 compute-0 NetworkManager[55434]: <info>  [1764915610.4595] device (tap2957b826-c9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 06:20:10 compute-0 NetworkManager[55434]: <info>  [1764915610.4607] device (tap2957b826-c9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 06:20:10 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:10.460 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[08ca7353-427b-462f-83f8-ac2ce4e34bfc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:20:10 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:10.462 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[1a3f6d79-581a-47d6-8138-3117c362bd97]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:20:10 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:10.481 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[76dfcdf3-a387-4b02-a6c2-f15e13ee6856]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:20:10 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:10.493 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[3eb9f243-17c5-41a7-b11d-fe6cd453111a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdf5a1a6f-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:f8:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 345741, 'reachable_time': 34079, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210326, 'error': None, 'target': 'ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:20:10 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:10.506 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[d3a9bcab-f00d-4441-9bb5-537284a68f2f]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapdf5a1a6f-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 345751, 'tstamp': 345751}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210330, 'error': None, 'target': 'ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapdf5a1a6f-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 345753, 'tstamp': 345753}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210330, 'error': None, 'target': 'ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:20:10 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:10.507 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdf5a1a6f-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:20:10 compute-0 nova_compute[186329]: 2025-12-05 06:20:10.508 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:10 compute-0 nova_compute[186329]: 2025-12-05 06:20:10.509 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:10 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:10.511 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdf5a1a6f-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:20:10 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:10.511 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:20:10 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:10.511 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdf5a1a6f-30, col_values=(('external_ids', {'iface-id': '11b2e7a6-c4ec-4f31-8535-807d9ce71179'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:20:10 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:10.512 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:20:10 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:10.513 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[d668ba81-d62a-4b2a-ab89-f8ce19e08338]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/df5a1a6f-32f3-42ca-8c18-60ea4ce9d923.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID df5a1a6f-32f3-42ca-8c18-60ea4ce9d923\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:20:11 compute-0 nova_compute[186329]: 2025-12-05 06:20:11.316 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:12 compute-0 ovn_controller[95223]: 2025-12-05T06:20:12Z|00102|binding|INFO|Claiming lport 2957b826-c9c7-44e6-bf3d-a6c1dc5cb3cd for this chassis.
Dec 05 06:20:12 compute-0 ovn_controller[95223]: 2025-12-05T06:20:12Z|00103|binding|INFO|2957b826-c9c7-44e6-bf3d-a6c1dc5cb3cd: Claiming fa:16:3e:1f:53:de 10.100.0.4
Dec 05 06:20:12 compute-0 ovn_controller[95223]: 2025-12-05T06:20:12Z|00104|binding|INFO|Setting lport 2957b826-c9c7-44e6-bf3d-a6c1dc5cb3cd up in Southbound
Dec 05 06:20:13 compute-0 nova_compute[186329]: 2025-12-05 06:20:13.726 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:13 compute-0 nova_compute[186329]: 2025-12-05 06:20:13.795 186333 INFO nova.compute.manager [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2cc50d23-400e-4537-aba3-e0b30f79963a] Post operation of migration started
Dec 05 06:20:13 compute-0 nova_compute[186329]: 2025-12-05 06:20:13.795 186333 WARNING neutronclient.v2_0.client [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:20:14 compute-0 nova_compute[186329]: 2025-12-05 06:20:14.507 186333 WARNING neutronclient.v2_0.client [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:20:14 compute-0 nova_compute[186329]: 2025-12-05 06:20:14.508 186333 WARNING neutronclient.v2_0.client [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:20:14 compute-0 nova_compute[186329]: 2025-12-05 06:20:14.584 186333 DEBUG oslo_concurrency.lockutils [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "refresh_cache-2cc50d23-400e-4537-aba3-e0b30f79963a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:20:14 compute-0 nova_compute[186329]: 2025-12-05 06:20:14.584 186333 DEBUG oslo_concurrency.lockutils [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquired lock "refresh_cache-2cc50d23-400e-4537-aba3-e0b30f79963a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:20:14 compute-0 nova_compute[186329]: 2025-12-05 06:20:14.584 186333 DEBUG nova.network.neutron [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2cc50d23-400e-4537-aba3-e0b30f79963a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 05 06:20:15 compute-0 nova_compute[186329]: 2025-12-05 06:20:15.090 186333 WARNING neutronclient.v2_0.client [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:20:15 compute-0 nova_compute[186329]: 2025-12-05 06:20:15.716 186333 WARNING neutronclient.v2_0.client [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:20:15 compute-0 nova_compute[186329]: 2025-12-05 06:20:15.827 186333 DEBUG nova.network.neutron [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2cc50d23-400e-4537-aba3-e0b30f79963a] Updating instance_info_cache with network_info: [{"id": "2957b826-c9c7-44e6-bf3d-a6c1dc5cb3cd", "address": "fa:16:3e:1f:53:de", "network": {"id": "df5a1a6f-32f3-42ca-8c18-60ea4ce9d923", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1816665749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36f184a410c54894823168ed0f00b1ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2957b826-c9", "ovs_interfaceid": "2957b826-c9c7-44e6-bf3d-a6c1dc5cb3cd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:20:16 compute-0 nova_compute[186329]: 2025-12-05 06:20:16.318 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:16 compute-0 nova_compute[186329]: 2025-12-05 06:20:16.332 186333 DEBUG oslo_concurrency.lockutils [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Releasing lock "refresh_cache-2cc50d23-400e-4537-aba3-e0b30f79963a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:20:16 compute-0 nova_compute[186329]: 2025-12-05 06:20:16.844 186333 DEBUG oslo_concurrency.lockutils [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:20:16 compute-0 nova_compute[186329]: 2025-12-05 06:20:16.844 186333 DEBUG oslo_concurrency.lockutils [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:20:16 compute-0 nova_compute[186329]: 2025-12-05 06:20:16.845 186333 DEBUG oslo_concurrency.lockutils [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:20:16 compute-0 nova_compute[186329]: 2025-12-05 06:20:16.848 186333 INFO nova.virt.libvirt.driver [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2cc50d23-400e-4537-aba3-e0b30f79963a] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Dec 05 06:20:16 compute-0 virtqemud[186605]: Domain id=8 name='instance-0000000a' uuid=2cc50d23-400e-4537-aba3-e0b30f79963a is tainted: custom-monitor
Dec 05 06:20:17 compute-0 nova_compute[186329]: 2025-12-05 06:20:17.852 186333 INFO nova.virt.libvirt.driver [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2cc50d23-400e-4537-aba3-e0b30f79963a] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Dec 05 06:20:18 compute-0 nova_compute[186329]: 2025-12-05 06:20:18.727 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:18 compute-0 nova_compute[186329]: 2025-12-05 06:20:18.856 186333 INFO nova.virt.libvirt.driver [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2cc50d23-400e-4537-aba3-e0b30f79963a] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Dec 05 06:20:18 compute-0 nova_compute[186329]: 2025-12-05 06:20:18.859 186333 DEBUG nova.compute.manager [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2cc50d23-400e-4537-aba3-e0b30f79963a] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 05 06:20:19 compute-0 nova_compute[186329]: 2025-12-05 06:20:19.366 186333 DEBUG nova.objects.instance [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2cc50d23-400e-4537-aba3-e0b30f79963a] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Dec 05 06:20:19 compute-0 podman[210353]: 2025-12-05 06:20:19.468528247 +0000 UTC m=+0.047979318 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 06:20:19 compute-0 podman[210352]: 2025-12-05 06:20:19.492576214 +0000 UTC m=+0.073801651 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 05 06:20:20 compute-0 nova_compute[186329]: 2025-12-05 06:20:20.377 186333 WARNING neutronclient.v2_0.client [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:20:20 compute-0 nova_compute[186329]: 2025-12-05 06:20:20.505 186333 WARNING neutronclient.v2_0.client [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:20:20 compute-0 nova_compute[186329]: 2025-12-05 06:20:20.505 186333 WARNING neutronclient.v2_0.client [None req-6daa89cf-d5ab-4f2e-937c-da1c108b14de e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:20:21 compute-0 nova_compute[186329]: 2025-12-05 06:20:21.320 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:23 compute-0 nova_compute[186329]: 2025-12-05 06:20:23.729 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:26 compute-0 nova_compute[186329]: 2025-12-05 06:20:26.322 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:28 compute-0 nova_compute[186329]: 2025-12-05 06:20:28.730 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:29 compute-0 podman[210400]: 2025-12-05 06:20:29.465436118 +0000 UTC m=+0.048815008 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Dec 05 06:20:29 compute-0 podman[210402]: 2025-12-05 06:20:29.475009978 +0000 UTC m=+0.052647009 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec)
Dec 05 06:20:29 compute-0 podman[210401]: 2025-12-05 06:20:29.483460446 +0000 UTC m=+0.064563189 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, architecture=x86_64, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_id=edpm, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9)
Dec 05 06:20:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:29.503 104041 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:20:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:29.503 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:20:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:29.503 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:20:29 compute-0 podman[196599]: time="2025-12-05T06:20:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:20:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:20:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18590 "" "Go-http-client/1.1"
Dec 05 06:20:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:20:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3042 "" "Go-http-client/1.1"
Dec 05 06:20:30 compute-0 nova_compute[186329]: 2025-12-05 06:20:30.495 186333 DEBUG oslo_concurrency.lockutils [None req-806962c0-0dc6-4d89-8d2d-00ef708e8f12 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Acquiring lock "2a160ef8-a2f1-4959-a154-f46101bf8277" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:20:30 compute-0 nova_compute[186329]: 2025-12-05 06:20:30.496 186333 DEBUG oslo_concurrency.lockutils [None req-806962c0-0dc6-4d89-8d2d-00ef708e8f12 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "2a160ef8-a2f1-4959-a154-f46101bf8277" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:20:30 compute-0 nova_compute[186329]: 2025-12-05 06:20:30.496 186333 DEBUG oslo_concurrency.lockutils [None req-806962c0-0dc6-4d89-8d2d-00ef708e8f12 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Acquiring lock "2a160ef8-a2f1-4959-a154-f46101bf8277-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:20:30 compute-0 nova_compute[186329]: 2025-12-05 06:20:30.496 186333 DEBUG oslo_concurrency.lockutils [None req-806962c0-0dc6-4d89-8d2d-00ef708e8f12 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "2a160ef8-a2f1-4959-a154-f46101bf8277-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:20:30 compute-0 nova_compute[186329]: 2025-12-05 06:20:30.496 186333 DEBUG oslo_concurrency.lockutils [None req-806962c0-0dc6-4d89-8d2d-00ef708e8f12 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "2a160ef8-a2f1-4959-a154-f46101bf8277-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:20:30 compute-0 nova_compute[186329]: 2025-12-05 06:20:30.503 186333 INFO nova.compute.manager [None req-806962c0-0dc6-4d89-8d2d-00ef708e8f12 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Terminating instance
Dec 05 06:20:31 compute-0 nova_compute[186329]: 2025-12-05 06:20:31.011 186333 DEBUG nova.compute.manager [None req-806962c0-0dc6-4d89-8d2d-00ef708e8f12 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 05 06:20:31 compute-0 kernel: tapde6bc18e-8f (unregistering): left promiscuous mode
Dec 05 06:20:31 compute-0 NetworkManager[55434]: <info>  [1764915631.0329] device (tapde6bc18e-8f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 06:20:31 compute-0 ovn_controller[95223]: 2025-12-05T06:20:31Z|00105|binding|INFO|Releasing lport de6bc18e-8fd5-4a3e-ac7d-36278d4793df from this chassis (sb_readonly=0)
Dec 05 06:20:31 compute-0 ovn_controller[95223]: 2025-12-05T06:20:31Z|00106|binding|INFO|Setting lport de6bc18e-8fd5-4a3e-ac7d-36278d4793df down in Southbound
Dec 05 06:20:31 compute-0 ovn_controller[95223]: 2025-12-05T06:20:31Z|00107|binding|INFO|Removing iface tapde6bc18e-8f ovn-installed in OVS
Dec 05 06:20:31 compute-0 nova_compute[186329]: 2025-12-05 06:20:31.045 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:31 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:31.045 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:86:0b:00 10.100.0.14'], port_security=['fa:16:3e:86:0b:00 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '2a160ef8-a2f1-4959-a154-f46101bf8277', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '72c4ee5cc96a42b99210abaf8ae6fcc3', 'neutron:revision_number': '5', 'neutron:security_group_ids': '8deca48c-f664-4748-b23a-2c5f69c6b17a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1542876b-3afe-4c08-a982-954e6ce063b6, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], logical_port=de6bc18e-8fd5-4a3e-ac7d-36278d4793df) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:20:31 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:31.046 104041 INFO neutron.agent.ovn.metadata.agent [-] Port de6bc18e-8fd5-4a3e-ac7d-36278d4793df in datapath df5a1a6f-32f3-42ca-8c18-60ea4ce9d923 unbound from our chassis
Dec 05 06:20:31 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:31.047 104041 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network df5a1a6f-32f3-42ca-8c18-60ea4ce9d923
Dec 05 06:20:31 compute-0 nova_compute[186329]: 2025-12-05 06:20:31.058 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:31 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:31.062 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[75bbdbf7-43e0-4d72-92f9-26f0ca694e01]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:20:31 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Dec 05 06:20:31 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000b.scope: Consumed 12.139s CPU time.
Dec 05 06:20:31 compute-0 systemd-machined[152967]: Machine qemu-7-instance-0000000b terminated.
Dec 05 06:20:31 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:31.082 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[113f1548-32f6-4e40-b81a-35f48bf96d50]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:20:31 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:31.084 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[3424c766-7ab8-4103-9a57-b6260a69605f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:20:31 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:31.103 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[f287496f-c9fb-4ee3-846a-e6dd8e510ba3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:20:31 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:31.115 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[51b8c1ab-d638-4e8f-bad6-82d11c7144ef]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdf5a1a6f-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:f8:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 345741, 'reachable_time': 34079, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210463, 'error': None, 'target': 'ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:20:31 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:31.125 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[11230e98-9ea4-4fae-a402-ead1afea140c]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapdf5a1a6f-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 345751, 'tstamp': 345751}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210464, 'error': None, 'target': 'ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapdf5a1a6f-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 345753, 'tstamp': 345753}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210464, 'error': None, 'target': 'ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:20:31 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:31.126 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdf5a1a6f-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:20:31 compute-0 nova_compute[186329]: 2025-12-05 06:20:31.128 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:31 compute-0 nova_compute[186329]: 2025-12-05 06:20:31.131 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:31 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:31.131 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdf5a1a6f-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:20:31 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:31.131 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:20:31 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:31.132 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdf5a1a6f-30, col_values=(('external_ids', {'iface-id': '11b2e7a6-c4ec-4f31-8535-807d9ce71179'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:20:31 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:31.132 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:20:31 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:31.133 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[ba0f2c28-4e65-439c-9dc4-77c8fa298749]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/df5a1a6f-32f3-42ca-8c18-60ea4ce9d923.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID df5a1a6f-32f3-42ca-8c18-60ea4ce9d923\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:20:31 compute-0 nova_compute[186329]: 2025-12-05 06:20:31.250 186333 INFO nova.virt.libvirt.driver [-] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Instance destroyed successfully.
Dec 05 06:20:31 compute-0 nova_compute[186329]: 2025-12-05 06:20:31.250 186333 DEBUG nova.objects.instance [None req-806962c0-0dc6-4d89-8d2d-00ef708e8f12 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lazy-loading 'resources' on Instance uuid 2a160ef8-a2f1-4959-a154-f46101bf8277 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:20:31 compute-0 nova_compute[186329]: 2025-12-05 06:20:31.323 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:31 compute-0 openstack_network_exporter[198686]: ERROR   06:20:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:20:31 compute-0 openstack_network_exporter[198686]: ERROR   06:20:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:20:31 compute-0 openstack_network_exporter[198686]: ERROR   06:20:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:20:31 compute-0 openstack_network_exporter[198686]: ERROR   06:20:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:20:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:20:31 compute-0 openstack_network_exporter[198686]: ERROR   06:20:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:20:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:20:31 compute-0 nova_compute[186329]: 2025-12-05 06:20:31.609 186333 DEBUG nova.compute.manager [req-225f2eb5-e813-42b5-98a3-89a391b54415 req-01990bcb-6a32-4c3c-bb09-42f88f77f443 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Received event network-vif-unplugged-de6bc18e-8fd5-4a3e-ac7d-36278d4793df external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:20:31 compute-0 nova_compute[186329]: 2025-12-05 06:20:31.609 186333 DEBUG oslo_concurrency.lockutils [req-225f2eb5-e813-42b5-98a3-89a391b54415 req-01990bcb-6a32-4c3c-bb09-42f88f77f443 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "2a160ef8-a2f1-4959-a154-f46101bf8277-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:20:31 compute-0 nova_compute[186329]: 2025-12-05 06:20:31.609 186333 DEBUG oslo_concurrency.lockutils [req-225f2eb5-e813-42b5-98a3-89a391b54415 req-01990bcb-6a32-4c3c-bb09-42f88f77f443 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "2a160ef8-a2f1-4959-a154-f46101bf8277-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:20:31 compute-0 nova_compute[186329]: 2025-12-05 06:20:31.610 186333 DEBUG oslo_concurrency.lockutils [req-225f2eb5-e813-42b5-98a3-89a391b54415 req-01990bcb-6a32-4c3c-bb09-42f88f77f443 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "2a160ef8-a2f1-4959-a154-f46101bf8277-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:20:31 compute-0 nova_compute[186329]: 2025-12-05 06:20:31.610 186333 DEBUG nova.compute.manager [req-225f2eb5-e813-42b5-98a3-89a391b54415 req-01990bcb-6a32-4c3c-bb09-42f88f77f443 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] No waiting events found dispatching network-vif-unplugged-de6bc18e-8fd5-4a3e-ac7d-36278d4793df pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:20:31 compute-0 nova_compute[186329]: 2025-12-05 06:20:31.610 186333 DEBUG nova.compute.manager [req-225f2eb5-e813-42b5-98a3-89a391b54415 req-01990bcb-6a32-4c3c-bb09-42f88f77f443 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Received event network-vif-unplugged-de6bc18e-8fd5-4a3e-ac7d-36278d4793df for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 05 06:20:31 compute-0 nova_compute[186329]: 2025-12-05 06:20:31.754 186333 DEBUG nova.virt.libvirt.vif [None req-806962c0-0dc6-4d89-8d2d-00ef708e8f12 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-12-05T06:19:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-308361040',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-308361040',id=11,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T06:19:40Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='72c4ee5cc96a42b99210abaf8ae6fcc3',ramdisk_id='',reservation_id='r-cl0afaj0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',ima
ge_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1144698866',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1144698866-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T06:19:40Z,user_data=None,user_id='6e966152abb6429a8d2dc82faf5464b5',uuid=2a160ef8-a2f1-4959-a154-f46101bf8277,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "de6bc18e-8fd5-4a3e-ac7d-36278d4793df", "address": "fa:16:3e:86:0b:00", "network": {"id": "df5a1a6f-32f3-42ca-8c18-60ea4ce9d923", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1816665749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36f184a410c54894823168ed0f00b1ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde6bc18e-8f", "ovs_interfaceid": "de6bc18e-8fd5-4a3e-ac7d-36278d4793df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 05 06:20:31 compute-0 nova_compute[186329]: 2025-12-05 06:20:31.755 186333 DEBUG nova.network.os_vif_util [None req-806962c0-0dc6-4d89-8d2d-00ef708e8f12 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Converting VIF {"id": "de6bc18e-8fd5-4a3e-ac7d-36278d4793df", "address": "fa:16:3e:86:0b:00", "network": {"id": "df5a1a6f-32f3-42ca-8c18-60ea4ce9d923", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1816665749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36f184a410c54894823168ed0f00b1ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapde6bc18e-8f", "ovs_interfaceid": "de6bc18e-8fd5-4a3e-ac7d-36278d4793df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:20:31 compute-0 nova_compute[186329]: 2025-12-05 06:20:31.755 186333 DEBUG nova.network.os_vif_util [None req-806962c0-0dc6-4d89-8d2d-00ef708e8f12 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:86:0b:00,bridge_name='br-int',has_traffic_filtering=True,id=de6bc18e-8fd5-4a3e-ac7d-36278d4793df,network=Network(df5a1a6f-32f3-42ca-8c18-60ea4ce9d923),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde6bc18e-8f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:20:31 compute-0 nova_compute[186329]: 2025-12-05 06:20:31.755 186333 DEBUG os_vif [None req-806962c0-0dc6-4d89-8d2d-00ef708e8f12 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:0b:00,bridge_name='br-int',has_traffic_filtering=True,id=de6bc18e-8fd5-4a3e-ac7d-36278d4793df,network=Network(df5a1a6f-32f3-42ca-8c18-60ea4ce9d923),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde6bc18e-8f') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 05 06:20:31 compute-0 nova_compute[186329]: 2025-12-05 06:20:31.757 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:31 compute-0 nova_compute[186329]: 2025-12-05 06:20:31.757 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapde6bc18e-8f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:20:31 compute-0 nova_compute[186329]: 2025-12-05 06:20:31.758 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:31 compute-0 nova_compute[186329]: 2025-12-05 06:20:31.760 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:31 compute-0 nova_compute[186329]: 2025-12-05 06:20:31.760 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:31 compute-0 nova_compute[186329]: 2025-12-05 06:20:31.760 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=2ab5cd15-222d-4269-a0ec-0e6972df3afa) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:20:31 compute-0 nova_compute[186329]: 2025-12-05 06:20:31.761 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:31 compute-0 nova_compute[186329]: 2025-12-05 06:20:31.761 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:31 compute-0 nova_compute[186329]: 2025-12-05 06:20:31.763 186333 INFO os_vif [None req-806962c0-0dc6-4d89-8d2d-00ef708e8f12 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:0b:00,bridge_name='br-int',has_traffic_filtering=True,id=de6bc18e-8fd5-4a3e-ac7d-36278d4793df,network=Network(df5a1a6f-32f3-42ca-8c18-60ea4ce9d923),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde6bc18e-8f')
Dec 05 06:20:31 compute-0 nova_compute[186329]: 2025-12-05 06:20:31.764 186333 INFO nova.virt.libvirt.driver [None req-806962c0-0dc6-4d89-8d2d-00ef708e8f12 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Deleting instance files /var/lib/nova/instances/2a160ef8-a2f1-4959-a154-f46101bf8277_del
Dec 05 06:20:31 compute-0 nova_compute[186329]: 2025-12-05 06:20:31.764 186333 INFO nova.virt.libvirt.driver [None req-806962c0-0dc6-4d89-8d2d-00ef708e8f12 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Deletion of /var/lib/nova/instances/2a160ef8-a2f1-4959-a154-f46101bf8277_del complete
Dec 05 06:20:32 compute-0 nova_compute[186329]: 2025-12-05 06:20:32.273 186333 INFO nova.compute.manager [None req-806962c0-0dc6-4d89-8d2d-00ef708e8f12 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Took 1.26 seconds to destroy the instance on the hypervisor.
Dec 05 06:20:32 compute-0 nova_compute[186329]: 2025-12-05 06:20:32.273 186333 DEBUG oslo.service.backend._eventlet.loopingcall [None req-806962c0-0dc6-4d89-8d2d-00ef708e8f12 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 05 06:20:32 compute-0 nova_compute[186329]: 2025-12-05 06:20:32.274 186333 DEBUG nova.compute.manager [-] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 05 06:20:32 compute-0 nova_compute[186329]: 2025-12-05 06:20:32.274 186333 DEBUG nova.network.neutron [-] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 05 06:20:32 compute-0 nova_compute[186329]: 2025-12-05 06:20:32.274 186333 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:20:32 compute-0 nova_compute[186329]: 2025-12-05 06:20:32.505 186333 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:20:32 compute-0 nova_compute[186329]: 2025-12-05 06:20:32.717 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:32 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:32.717 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '46:84:d1', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'a6:99:88:db:d9:a2'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:20:32 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:32.719 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 05 06:20:32 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:32.719 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=89d40815-76f5-4f1d-9077-84d831b7d6c4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:20:32 compute-0 nova_compute[186329]: 2025-12-05 06:20:32.826 186333 DEBUG nova.compute.manager [req-572b0757-f5f6-4ab0-9ae3-cbdae0899084 req-3ff36409-9130-4133-aea7-b74da5f4e273 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Received event network-vif-deleted-de6bc18e-8fd5-4a3e-ac7d-36278d4793df external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:20:32 compute-0 nova_compute[186329]: 2025-12-05 06:20:32.826 186333 INFO nova.compute.manager [req-572b0757-f5f6-4ab0-9ae3-cbdae0899084 req-3ff36409-9130-4133-aea7-b74da5f4e273 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Neutron deleted interface de6bc18e-8fd5-4a3e-ac7d-36278d4793df; detaching it from the instance and deleting it from the info cache
Dec 05 06:20:32 compute-0 nova_compute[186329]: 2025-12-05 06:20:32.826 186333 DEBUG nova.network.neutron [req-572b0757-f5f6-4ab0-9ae3-cbdae0899084 req-3ff36409-9130-4133-aea7-b74da5f4e273 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:20:33 compute-0 nova_compute[186329]: 2025-12-05 06:20:33.287 186333 DEBUG nova.network.neutron [-] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:20:33 compute-0 nova_compute[186329]: 2025-12-05 06:20:33.331 186333 DEBUG nova.compute.manager [req-572b0757-f5f6-4ab0-9ae3-cbdae0899084 req-3ff36409-9130-4133-aea7-b74da5f4e273 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Detach interface failed, port_id=de6bc18e-8fd5-4a3e-ac7d-36278d4793df, reason: Instance 2a160ef8-a2f1-4959-a154-f46101bf8277 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Dec 05 06:20:33 compute-0 nova_compute[186329]: 2025-12-05 06:20:33.652 186333 DEBUG nova.compute.manager [req-2bc19a77-b6ad-48b5-89e3-aafde678a60a req-82595727-5df2-49ef-92d7-50edd9a74455 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Received event network-vif-unplugged-de6bc18e-8fd5-4a3e-ac7d-36278d4793df external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:20:33 compute-0 nova_compute[186329]: 2025-12-05 06:20:33.652 186333 DEBUG oslo_concurrency.lockutils [req-2bc19a77-b6ad-48b5-89e3-aafde678a60a req-82595727-5df2-49ef-92d7-50edd9a74455 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "2a160ef8-a2f1-4959-a154-f46101bf8277-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:20:33 compute-0 nova_compute[186329]: 2025-12-05 06:20:33.652 186333 DEBUG oslo_concurrency.lockutils [req-2bc19a77-b6ad-48b5-89e3-aafde678a60a req-82595727-5df2-49ef-92d7-50edd9a74455 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "2a160ef8-a2f1-4959-a154-f46101bf8277-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:20:33 compute-0 nova_compute[186329]: 2025-12-05 06:20:33.653 186333 DEBUG oslo_concurrency.lockutils [req-2bc19a77-b6ad-48b5-89e3-aafde678a60a req-82595727-5df2-49ef-92d7-50edd9a74455 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "2a160ef8-a2f1-4959-a154-f46101bf8277-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:20:33 compute-0 nova_compute[186329]: 2025-12-05 06:20:33.653 186333 DEBUG nova.compute.manager [req-2bc19a77-b6ad-48b5-89e3-aafde678a60a req-82595727-5df2-49ef-92d7-50edd9a74455 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] No waiting events found dispatching network-vif-unplugged-de6bc18e-8fd5-4a3e-ac7d-36278d4793df pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:20:33 compute-0 nova_compute[186329]: 2025-12-05 06:20:33.653 186333 DEBUG nova.compute.manager [req-2bc19a77-b6ad-48b5-89e3-aafde678a60a req-82595727-5df2-49ef-92d7-50edd9a74455 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Received event network-vif-unplugged-de6bc18e-8fd5-4a3e-ac7d-36278d4793df for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 05 06:20:33 compute-0 nova_compute[186329]: 2025-12-05 06:20:33.732 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:33 compute-0 nova_compute[186329]: 2025-12-05 06:20:33.791 186333 INFO nova.compute.manager [-] [instance: 2a160ef8-a2f1-4959-a154-f46101bf8277] Took 1.52 seconds to deallocate network for instance.
Dec 05 06:20:34 compute-0 nova_compute[186329]: 2025-12-05 06:20:34.303 186333 DEBUG oslo_concurrency.lockutils [None req-806962c0-0dc6-4d89-8d2d-00ef708e8f12 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:20:34 compute-0 nova_compute[186329]: 2025-12-05 06:20:34.303 186333 DEBUG oslo_concurrency.lockutils [None req-806962c0-0dc6-4d89-8d2d-00ef708e8f12 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:20:34 compute-0 nova_compute[186329]: 2025-12-05 06:20:34.363 186333 DEBUG nova.compute.provider_tree [None req-806962c0-0dc6-4d89-8d2d-00ef708e8f12 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:20:34 compute-0 nova_compute[186329]: 2025-12-05 06:20:34.869 186333 DEBUG nova.scheduler.client.report [None req-806962c0-0dc6-4d89-8d2d-00ef708e8f12 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:20:35 compute-0 nova_compute[186329]: 2025-12-05 06:20:35.375 186333 DEBUG oslo_concurrency.lockutils [None req-806962c0-0dc6-4d89-8d2d-00ef708e8f12 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.072s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:20:35 compute-0 nova_compute[186329]: 2025-12-05 06:20:35.413 186333 INFO nova.scheduler.client.report [None req-806962c0-0dc6-4d89-8d2d-00ef708e8f12 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Deleted allocations for instance 2a160ef8-a2f1-4959-a154-f46101bf8277
Dec 05 06:20:36 compute-0 nova_compute[186329]: 2025-12-05 06:20:36.429 186333 DEBUG oslo_concurrency.lockutils [None req-806962c0-0dc6-4d89-8d2d-00ef708e8f12 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "2a160ef8-a2f1-4959-a154-f46101bf8277" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.933s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:20:36 compute-0 nova_compute[186329]: 2025-12-05 06:20:36.761 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:38 compute-0 nova_compute[186329]: 2025-12-05 06:20:38.167 186333 DEBUG oslo_concurrency.lockutils [None req-37ae9d79-5fe5-46e5-a1f1-193fbfa910ca 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Acquiring lock "2cc50d23-400e-4537-aba3-e0b30f79963a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:20:38 compute-0 nova_compute[186329]: 2025-12-05 06:20:38.168 186333 DEBUG oslo_concurrency.lockutils [None req-37ae9d79-5fe5-46e5-a1f1-193fbfa910ca 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "2cc50d23-400e-4537-aba3-e0b30f79963a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:20:38 compute-0 nova_compute[186329]: 2025-12-05 06:20:38.168 186333 DEBUG oslo_concurrency.lockutils [None req-37ae9d79-5fe5-46e5-a1f1-193fbfa910ca 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Acquiring lock "2cc50d23-400e-4537-aba3-e0b30f79963a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:20:38 compute-0 nova_compute[186329]: 2025-12-05 06:20:38.169 186333 DEBUG oslo_concurrency.lockutils [None req-37ae9d79-5fe5-46e5-a1f1-193fbfa910ca 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "2cc50d23-400e-4537-aba3-e0b30f79963a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:20:38 compute-0 nova_compute[186329]: 2025-12-05 06:20:38.169 186333 DEBUG oslo_concurrency.lockutils [None req-37ae9d79-5fe5-46e5-a1f1-193fbfa910ca 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "2cc50d23-400e-4537-aba3-e0b30f79963a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:20:38 compute-0 nova_compute[186329]: 2025-12-05 06:20:38.176 186333 INFO nova.compute.manager [None req-37ae9d79-5fe5-46e5-a1f1-193fbfa910ca 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2cc50d23-400e-4537-aba3-e0b30f79963a] Terminating instance
Dec 05 06:20:38 compute-0 nova_compute[186329]: 2025-12-05 06:20:38.685 186333 DEBUG nova.compute.manager [None req-37ae9d79-5fe5-46e5-a1f1-193fbfa910ca 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2cc50d23-400e-4537-aba3-e0b30f79963a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 05 06:20:38 compute-0 kernel: tap2957b826-c9 (unregistering): left promiscuous mode
Dec 05 06:20:38 compute-0 NetworkManager[55434]: <info>  [1764915638.7063] device (tap2957b826-c9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 06:20:38 compute-0 nova_compute[186329]: 2025-12-05 06:20:38.710 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:38 compute-0 ovn_controller[95223]: 2025-12-05T06:20:38Z|00108|binding|INFO|Releasing lport 2957b826-c9c7-44e6-bf3d-a6c1dc5cb3cd from this chassis (sb_readonly=0)
Dec 05 06:20:38 compute-0 ovn_controller[95223]: 2025-12-05T06:20:38Z|00109|binding|INFO|Setting lport 2957b826-c9c7-44e6-bf3d-a6c1dc5cb3cd down in Southbound
Dec 05 06:20:38 compute-0 ovn_controller[95223]: 2025-12-05T06:20:38Z|00110|binding|INFO|Removing iface tap2957b826-c9 ovn-installed in OVS
Dec 05 06:20:38 compute-0 nova_compute[186329]: 2025-12-05 06:20:38.713 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:38.716 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:53:de 10.100.0.4'], port_security=['fa:16:3e:1f:53:de 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '2cc50d23-400e-4537-aba3-e0b30f79963a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '72c4ee5cc96a42b99210abaf8ae6fcc3', 'neutron:revision_number': '14', 'neutron:security_group_ids': '8deca48c-f664-4748-b23a-2c5f69c6b17a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1542876b-3afe-4c08-a982-954e6ce063b6, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], logical_port=2957b826-c9c7-44e6-bf3d-a6c1dc5cb3cd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:20:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:38.717 104041 INFO neutron.agent.ovn.metadata.agent [-] Port 2957b826-c9c7-44e6-bf3d-a6c1dc5cb3cd in datapath df5a1a6f-32f3-42ca-8c18-60ea4ce9d923 unbound from our chassis
Dec 05 06:20:38 compute-0 nova_compute[186329]: 2025-12-05 06:20:38.725 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:38.725 104041 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network df5a1a6f-32f3-42ca-8c18-60ea4ce9d923, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 05 06:20:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:38.726 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[360e33ca-c4e5-49a5-87f2-e9add51260e2]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:20:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:38.727 104041 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923 namespace which is not needed anymore
Dec 05 06:20:38 compute-0 nova_compute[186329]: 2025-12-05 06:20:38.734 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:38 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Dec 05 06:20:38 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000a.scope: Consumed 1.898s CPU time.
Dec 05 06:20:38 compute-0 systemd-machined[152967]: Machine qemu-8-instance-0000000a terminated.
Dec 05 06:20:38 compute-0 neutron-haproxy-ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923[210152]: [NOTICE]   (210156) : haproxy version is 3.0.5-8e879a5
Dec 05 06:20:38 compute-0 neutron-haproxy-ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923[210152]: [NOTICE]   (210156) : path to executable is /usr/sbin/haproxy
Dec 05 06:20:38 compute-0 neutron-haproxy-ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923[210152]: [WARNING]  (210156) : Exiting Master process...
Dec 05 06:20:38 compute-0 podman[210505]: 2025-12-05 06:20:38.803067782 +0000 UTC m=+0.019315676 container kill 36bb798422e0faccd57ea8e2e473cc1122192bfd9eb4e25d6c1576a6dc84c28b (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 05 06:20:38 compute-0 neutron-haproxy-ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923[210152]: [ALERT]    (210156) : Current worker (210158) exited with code 143 (Terminated)
Dec 05 06:20:38 compute-0 neutron-haproxy-ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923[210152]: [WARNING]  (210156) : All workers exited. Exiting... (0)
Dec 05 06:20:38 compute-0 systemd[1]: libpod-36bb798422e0faccd57ea8e2e473cc1122192bfd9eb4e25d6c1576a6dc84c28b.scope: Deactivated successfully.
Dec 05 06:20:38 compute-0 nova_compute[186329]: 2025-12-05 06:20:38.831 186333 DEBUG nova.compute.manager [req-55d4c98a-8c08-4f0d-8990-33b9714747d5 req-40bbdb8b-0a96-4e52-bda5-a0e15e45351f fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2cc50d23-400e-4537-aba3-e0b30f79963a] Received event network-vif-unplugged-2957b826-c9c7-44e6-bf3d-a6c1dc5cb3cd external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:20:38 compute-0 nova_compute[186329]: 2025-12-05 06:20:38.832 186333 DEBUG oslo_concurrency.lockutils [req-55d4c98a-8c08-4f0d-8990-33b9714747d5 req-40bbdb8b-0a96-4e52-bda5-a0e15e45351f fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "2cc50d23-400e-4537-aba3-e0b30f79963a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:20:38 compute-0 nova_compute[186329]: 2025-12-05 06:20:38.834 186333 DEBUG oslo_concurrency.lockutils [req-55d4c98a-8c08-4f0d-8990-33b9714747d5 req-40bbdb8b-0a96-4e52-bda5-a0e15e45351f fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "2cc50d23-400e-4537-aba3-e0b30f79963a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:20:38 compute-0 nova_compute[186329]: 2025-12-05 06:20:38.834 186333 DEBUG oslo_concurrency.lockutils [req-55d4c98a-8c08-4f0d-8990-33b9714747d5 req-40bbdb8b-0a96-4e52-bda5-a0e15e45351f fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "2cc50d23-400e-4537-aba3-e0b30f79963a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:20:38 compute-0 nova_compute[186329]: 2025-12-05 06:20:38.834 186333 DEBUG nova.compute.manager [req-55d4c98a-8c08-4f0d-8990-33b9714747d5 req-40bbdb8b-0a96-4e52-bda5-a0e15e45351f fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2cc50d23-400e-4537-aba3-e0b30f79963a] No waiting events found dispatching network-vif-unplugged-2957b826-c9c7-44e6-bf3d-a6c1dc5cb3cd pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:20:38 compute-0 nova_compute[186329]: 2025-12-05 06:20:38.834 186333 DEBUG nova.compute.manager [req-55d4c98a-8c08-4f0d-8990-33b9714747d5 req-40bbdb8b-0a96-4e52-bda5-a0e15e45351f fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2cc50d23-400e-4537-aba3-e0b30f79963a] Received event network-vif-unplugged-2957b826-c9c7-44e6-bf3d-a6c1dc5cb3cd for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 05 06:20:38 compute-0 podman[210517]: 2025-12-05 06:20:38.838074511 +0000 UTC m=+0.018492298 container died 36bb798422e0faccd57ea8e2e473cc1122192bfd9eb4e25d6c1576a6dc84c28b (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Dec 05 06:20:38 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-36bb798422e0faccd57ea8e2e473cc1122192bfd9eb4e25d6c1576a6dc84c28b-userdata-shm.mount: Deactivated successfully.
Dec 05 06:20:38 compute-0 systemd[1]: var-lib-containers-storage-overlay-707212482bd7660b4a0dffe9231f71cc7f60d767844b8a522d5388605f21c193-merged.mount: Deactivated successfully.
Dec 05 06:20:38 compute-0 podman[210517]: 2025-12-05 06:20:38.867237861 +0000 UTC m=+0.047655639 container cleanup 36bb798422e0faccd57ea8e2e473cc1122192bfd9eb4e25d6c1576a6dc84c28b (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 05 06:20:38 compute-0 systemd[1]: libpod-conmon-36bb798422e0faccd57ea8e2e473cc1122192bfd9eb4e25d6c1576a6dc84c28b.scope: Deactivated successfully.
Dec 05 06:20:38 compute-0 podman[210527]: 2025-12-05 06:20:38.879774348 +0000 UTC m=+0.029844140 container remove 36bb798422e0faccd57ea8e2e473cc1122192bfd9eb4e25d6c1576a6dc84c28b (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 05 06:20:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:38.883 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[771f265d-dcd4-43ea-8927-afc464344b45]: (4, ("Fri Dec  5 06:20:38 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923 (36bb798422e0faccd57ea8e2e473cc1122192bfd9eb4e25d6c1576a6dc84c28b)\n36bb798422e0faccd57ea8e2e473cc1122192bfd9eb4e25d6c1576a6dc84c28b\nFri Dec  5 06:20:38 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923 (36bb798422e0faccd57ea8e2e473cc1122192bfd9eb4e25d6c1576a6dc84c28b)\n36bb798422e0faccd57ea8e2e473cc1122192bfd9eb4e25d6c1576a6dc84c28b\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:20:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:38.884 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[99b162a6-66dc-4e73-b6f8-417470cd6e33]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:20:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:38.884 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/df5a1a6f-32f3-42ca-8c18-60ea4ce9d923.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/df5a1a6f-32f3-42ca-8c18-60ea4ce9d923.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:20:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:38.884 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[21890d29-e47d-411a-9c1e-59bfd01e8d40]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:20:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:38.885 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdf5a1a6f-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:20:38 compute-0 kernel: tapdf5a1a6f-30: left promiscuous mode
Dec 05 06:20:38 compute-0 nova_compute[186329]: 2025-12-05 06:20:38.886 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:38 compute-0 NetworkManager[55434]: <info>  [1764915638.9031] manager: (tap2957b826-c9): new Tun device (/org/freedesktop/NetworkManager/Devices/50)
Dec 05 06:20:38 compute-0 nova_compute[186329]: 2025-12-05 06:20:38.903 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:38 compute-0 nova_compute[186329]: 2025-12-05 06:20:38.904 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:38.906 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[8c0d0f52-85b6-4147-8d96-f1283df6356d]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:20:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:38.914 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[44267676-8be7-4466-b2f1-17ddeb6233ab]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:20:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:38.914 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[4e6098b2-ea4f-4c23-8c29-8549b2007826]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:20:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:38.926 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[abdbf67c-e20d-4f2d-bbc3-c6c4db7953de]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 345735, 'reachable_time': 43655, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210552, 'error': None, 'target': 'ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:20:38 compute-0 systemd[1]: run-netns-ovnmeta\x2ddf5a1a6f\x2d32f3\x2d42ca\x2d8c18\x2d60ea4ce9d923.mount: Deactivated successfully.
Dec 05 06:20:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:38.929 104153 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-df5a1a6f-32f3-42ca-8c18-60ea4ce9d923 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 05 06:20:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:38.929 104153 DEBUG oslo.privsep.daemon [-] privsep: reply[94c4c6e2-f02a-4f86-95d9-b1cc81e4775a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:20:38 compute-0 nova_compute[186329]: 2025-12-05 06:20:38.933 186333 INFO nova.virt.libvirt.driver [-] [instance: 2cc50d23-400e-4537-aba3-e0b30f79963a] Instance destroyed successfully.
Dec 05 06:20:38 compute-0 nova_compute[186329]: 2025-12-05 06:20:38.933 186333 DEBUG nova.objects.instance [None req-37ae9d79-5fe5-46e5-a1f1-193fbfa910ca 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lazy-loading 'resources' on Instance uuid 2cc50d23-400e-4537-aba3-e0b30f79963a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:20:39 compute-0 nova_compute[186329]: 2025-12-05 06:20:39.437 186333 DEBUG nova.virt.libvirt.vif [None req-37ae9d79-5fe5-46e5-a1f1-193fbfa910ca 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-12-05T06:19:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-215099084',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-215099084',id=10,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T06:19:20Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='72c4ee5cc96a42b99210abaf8ae6fcc3',ramdisk_id='',reservation_id='r-98t7n2q0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',clean_attempts='1',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-1144698866',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-1144698866-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T06:20:19Z,user_data=None,user_id='6e966152abb6429a8d2dc82faf5464b5',uuid=2cc50d23-400e-4537-aba3-e0b30f79963a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2957b826-c9c7-44e6-bf3d-a6c1dc5cb3cd", "address": "fa:16:3e:1f:53:de", "network": {"id": "df5a1a6f-32f3-42ca-8c18-60ea4ce9d923", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1816665749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36f184a410c54894823168ed0f00b1ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2957b826-c9", "ovs_interfaceid": "2957b826-c9c7-44e6-bf3d-a6c1dc5cb3cd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 05 06:20:39 compute-0 nova_compute[186329]: 2025-12-05 06:20:39.438 186333 DEBUG nova.network.os_vif_util [None req-37ae9d79-5fe5-46e5-a1f1-193fbfa910ca 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Converting VIF {"id": "2957b826-c9c7-44e6-bf3d-a6c1dc5cb3cd", "address": "fa:16:3e:1f:53:de", "network": {"id": "df5a1a6f-32f3-42ca-8c18-60ea4ce9d923", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1816665749-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36f184a410c54894823168ed0f00b1ce", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2957b826-c9", "ovs_interfaceid": "2957b826-c9c7-44e6-bf3d-a6c1dc5cb3cd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:20:39 compute-0 nova_compute[186329]: 2025-12-05 06:20:39.438 186333 DEBUG nova.network.os_vif_util [None req-37ae9d79-5fe5-46e5-a1f1-193fbfa910ca 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1f:53:de,bridge_name='br-int',has_traffic_filtering=True,id=2957b826-c9c7-44e6-bf3d-a6c1dc5cb3cd,network=Network(df5a1a6f-32f3-42ca-8c18-60ea4ce9d923),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2957b826-c9') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:20:39 compute-0 nova_compute[186329]: 2025-12-05 06:20:39.439 186333 DEBUG os_vif [None req-37ae9d79-5fe5-46e5-a1f1-193fbfa910ca 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1f:53:de,bridge_name='br-int',has_traffic_filtering=True,id=2957b826-c9c7-44e6-bf3d-a6c1dc5cb3cd,network=Network(df5a1a6f-32f3-42ca-8c18-60ea4ce9d923),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2957b826-c9') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 05 06:20:39 compute-0 nova_compute[186329]: 2025-12-05 06:20:39.439 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:39 compute-0 nova_compute[186329]: 2025-12-05 06:20:39.440 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2957b826-c9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:20:39 compute-0 nova_compute[186329]: 2025-12-05 06:20:39.441 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:39 compute-0 nova_compute[186329]: 2025-12-05 06:20:39.442 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:39 compute-0 nova_compute[186329]: 2025-12-05 06:20:39.442 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:39 compute-0 nova_compute[186329]: 2025-12-05 06:20:39.442 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=2ee87e5a-d528-4b82-a6d0-3427744a473c) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:20:39 compute-0 nova_compute[186329]: 2025-12-05 06:20:39.445 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:20:39 compute-0 nova_compute[186329]: 2025-12-05 06:20:39.446 186333 INFO os_vif [None req-37ae9d79-5fe5-46e5-a1f1-193fbfa910ca 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1f:53:de,bridge_name='br-int',has_traffic_filtering=True,id=2957b826-c9c7-44e6-bf3d-a6c1dc5cb3cd,network=Network(df5a1a6f-32f3-42ca-8c18-60ea4ce9d923),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2957b826-c9')
Dec 05 06:20:39 compute-0 nova_compute[186329]: 2025-12-05 06:20:39.446 186333 INFO nova.virt.libvirt.driver [None req-37ae9d79-5fe5-46e5-a1f1-193fbfa910ca 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2cc50d23-400e-4537-aba3-e0b30f79963a] Deleting instance files /var/lib/nova/instances/2cc50d23-400e-4537-aba3-e0b30f79963a_del
Dec 05 06:20:39 compute-0 nova_compute[186329]: 2025-12-05 06:20:39.447 186333 INFO nova.virt.libvirt.driver [None req-37ae9d79-5fe5-46e5-a1f1-193fbfa910ca 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2cc50d23-400e-4537-aba3-e0b30f79963a] Deletion of /var/lib/nova/instances/2cc50d23-400e-4537-aba3-e0b30f79963a_del complete
Dec 05 06:20:39 compute-0 nova_compute[186329]: 2025-12-05 06:20:39.955 186333 INFO nova.compute.manager [None req-37ae9d79-5fe5-46e5-a1f1-193fbfa910ca 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] [instance: 2cc50d23-400e-4537-aba3-e0b30f79963a] Took 1.27 seconds to destroy the instance on the hypervisor.
Dec 05 06:20:39 compute-0 nova_compute[186329]: 2025-12-05 06:20:39.955 186333 DEBUG oslo.service.backend._eventlet.loopingcall [None req-37ae9d79-5fe5-46e5-a1f1-193fbfa910ca 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 05 06:20:39 compute-0 nova_compute[186329]: 2025-12-05 06:20:39.955 186333 DEBUG nova.compute.manager [-] [instance: 2cc50d23-400e-4537-aba3-e0b30f79963a] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 05 06:20:39 compute-0 nova_compute[186329]: 2025-12-05 06:20:39.955 186333 DEBUG nova.network.neutron [-] [instance: 2cc50d23-400e-4537-aba3-e0b30f79963a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 05 06:20:39 compute-0 nova_compute[186329]: 2025-12-05 06:20:39.955 186333 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:20:40 compute-0 nova_compute[186329]: 2025-12-05 06:20:40.215 186333 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:20:40 compute-0 nova_compute[186329]: 2025-12-05 06:20:40.663 186333 DEBUG nova.compute.manager [req-1cf87355-86c8-4c05-bd5a-fe06f94c01f4 req-93d456a1-ef24-46c6-9b7b-0889b7ff4c54 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2cc50d23-400e-4537-aba3-e0b30f79963a] Received event network-vif-deleted-2957b826-c9c7-44e6-bf3d-a6c1dc5cb3cd external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:20:40 compute-0 nova_compute[186329]: 2025-12-05 06:20:40.664 186333 INFO nova.compute.manager [req-1cf87355-86c8-4c05-bd5a-fe06f94c01f4 req-93d456a1-ef24-46c6-9b7b-0889b7ff4c54 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2cc50d23-400e-4537-aba3-e0b30f79963a] Neutron deleted interface 2957b826-c9c7-44e6-bf3d-a6c1dc5cb3cd; detaching it from the instance and deleting it from the info cache
Dec 05 06:20:40 compute-0 nova_compute[186329]: 2025-12-05 06:20:40.664 186333 DEBUG nova.network.neutron [req-1cf87355-86c8-4c05-bd5a-fe06f94c01f4 req-93d456a1-ef24-46c6-9b7b-0889b7ff4c54 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2cc50d23-400e-4537-aba3-e0b30f79963a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:20:40 compute-0 nova_compute[186329]: 2025-12-05 06:20:40.881 186333 DEBUG nova.compute.manager [req-054dd8f6-6988-40b9-88ec-8b0144c94a66 req-8bc9e992-75dd-4c49-9207-224011256b2a fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2cc50d23-400e-4537-aba3-e0b30f79963a] Received event network-vif-unplugged-2957b826-c9c7-44e6-bf3d-a6c1dc5cb3cd external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:20:40 compute-0 nova_compute[186329]: 2025-12-05 06:20:40.881 186333 DEBUG oslo_concurrency.lockutils [req-054dd8f6-6988-40b9-88ec-8b0144c94a66 req-8bc9e992-75dd-4c49-9207-224011256b2a fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "2cc50d23-400e-4537-aba3-e0b30f79963a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:20:40 compute-0 nova_compute[186329]: 2025-12-05 06:20:40.882 186333 DEBUG oslo_concurrency.lockutils [req-054dd8f6-6988-40b9-88ec-8b0144c94a66 req-8bc9e992-75dd-4c49-9207-224011256b2a fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "2cc50d23-400e-4537-aba3-e0b30f79963a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:20:40 compute-0 nova_compute[186329]: 2025-12-05 06:20:40.882 186333 DEBUG oslo_concurrency.lockutils [req-054dd8f6-6988-40b9-88ec-8b0144c94a66 req-8bc9e992-75dd-4c49-9207-224011256b2a fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "2cc50d23-400e-4537-aba3-e0b30f79963a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:20:40 compute-0 nova_compute[186329]: 2025-12-05 06:20:40.882 186333 DEBUG nova.compute.manager [req-054dd8f6-6988-40b9-88ec-8b0144c94a66 req-8bc9e992-75dd-4c49-9207-224011256b2a fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2cc50d23-400e-4537-aba3-e0b30f79963a] No waiting events found dispatching network-vif-unplugged-2957b826-c9c7-44e6-bf3d-a6c1dc5cb3cd pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:20:40 compute-0 nova_compute[186329]: 2025-12-05 06:20:40.882 186333 DEBUG nova.compute.manager [req-054dd8f6-6988-40b9-88ec-8b0144c94a66 req-8bc9e992-75dd-4c49-9207-224011256b2a fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2cc50d23-400e-4537-aba3-e0b30f79963a] Received event network-vif-unplugged-2957b826-c9c7-44e6-bf3d-a6c1dc5cb3cd for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 05 06:20:41 compute-0 nova_compute[186329]: 2025-12-05 06:20:41.124 186333 DEBUG nova.network.neutron [-] [instance: 2cc50d23-400e-4537-aba3-e0b30f79963a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:20:41 compute-0 nova_compute[186329]: 2025-12-05 06:20:41.168 186333 DEBUG nova.compute.manager [req-1cf87355-86c8-4c05-bd5a-fe06f94c01f4 req-93d456a1-ef24-46c6-9b7b-0889b7ff4c54 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 2cc50d23-400e-4537-aba3-e0b30f79963a] Detach interface failed, port_id=2957b826-c9c7-44e6-bf3d-a6c1dc5cb3cd, reason: Instance 2cc50d23-400e-4537-aba3-e0b30f79963a could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Dec 05 06:20:41 compute-0 nova_compute[186329]: 2025-12-05 06:20:41.629 186333 INFO nova.compute.manager [-] [instance: 2cc50d23-400e-4537-aba3-e0b30f79963a] Took 1.67 seconds to deallocate network for instance.
Dec 05 06:20:42 compute-0 nova_compute[186329]: 2025-12-05 06:20:42.142 186333 DEBUG oslo_concurrency.lockutils [None req-37ae9d79-5fe5-46e5-a1f1-193fbfa910ca 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:20:42 compute-0 nova_compute[186329]: 2025-12-05 06:20:42.142 186333 DEBUG oslo_concurrency.lockutils [None req-37ae9d79-5fe5-46e5-a1f1-193fbfa910ca 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:20:42 compute-0 nova_compute[186329]: 2025-12-05 06:20:42.146 186333 DEBUG oslo_concurrency.lockutils [None req-37ae9d79-5fe5-46e5-a1f1-193fbfa910ca 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.004s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:20:42 compute-0 nova_compute[186329]: 2025-12-05 06:20:42.193 186333 INFO nova.scheduler.client.report [None req-37ae9d79-5fe5-46e5-a1f1-193fbfa910ca 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Deleted allocations for instance 2cc50d23-400e-4537-aba3-e0b30f79963a
Dec 05 06:20:43 compute-0 nova_compute[186329]: 2025-12-05 06:20:43.212 186333 DEBUG oslo_concurrency.lockutils [None req-37ae9d79-5fe5-46e5-a1f1-193fbfa910ca 6e966152abb6429a8d2dc82faf5464b5 72c4ee5cc96a42b99210abaf8ae6fcc3 - - default default] Lock "2cc50d23-400e-4537-aba3-e0b30f79963a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.044s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:20:43 compute-0 nova_compute[186329]: 2025-12-05 06:20:43.736 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:44 compute-0 nova_compute[186329]: 2025-12-05 06:20:44.210 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:20:44 compute-0 nova_compute[186329]: 2025-12-05 06:20:44.444 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:46 compute-0 nova_compute[186329]: 2025-12-05 06:20:46.951 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:47 compute-0 nova_compute[186329]: 2025-12-05 06:20:47.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:20:48 compute-0 nova_compute[186329]: 2025-12-05 06:20:48.738 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:49 compute-0 nova_compute[186329]: 2025-12-05 06:20:49.445 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:49 compute-0 nova_compute[186329]: 2025-12-05 06:20:49.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:20:49 compute-0 nova_compute[186329]: 2025-12-05 06:20:49.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:20:50 compute-0 podman[210561]: 2025-12-05 06:20:50.462439661 +0000 UTC m=+0.042354258 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 05 06:20:50 compute-0 podman[210560]: 2025-12-05 06:20:50.486462872 +0000 UTC m=+0.068393707 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4)
Dec 05 06:20:50 compute-0 nova_compute[186329]: 2025-12-05 06:20:50.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:20:50 compute-0 nova_compute[186329]: 2025-12-05 06:20:50.709 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 05 06:20:50 compute-0 nova_compute[186329]: 2025-12-05 06:20:50.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:20:51 compute-0 nova_compute[186329]: 2025-12-05 06:20:51.218 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:20:51 compute-0 nova_compute[186329]: 2025-12-05 06:20:51.218 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:20:51 compute-0 nova_compute[186329]: 2025-12-05 06:20:51.219 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:20:51 compute-0 nova_compute[186329]: 2025-12-05 06:20:51.219 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 05 06:20:51 compute-0 nova_compute[186329]: 2025-12-05 06:20:51.394 186333 WARNING nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:20:51 compute-0 nova_compute[186329]: 2025-12-05 06:20:51.395 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:20:51 compute-0 nova_compute[186329]: 2025-12-05 06:20:51.407 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.012s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:20:51 compute-0 nova_compute[186329]: 2025-12-05 06:20:51.407 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5887MB free_disk=73.16721725463867GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": 
"0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 05 06:20:51 compute-0 nova_compute[186329]: 2025-12-05 06:20:51.408 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:20:51 compute-0 nova_compute[186329]: 2025-12-05 06:20:51.408 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:20:52 compute-0 nova_compute[186329]: 2025-12-05 06:20:52.969 186333 INFO nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance 83dcc9e9-9fcf-4e34-8b76-598c38ed9bc0 has allocations against this compute host but is not found in the database.
Dec 05 06:20:52 compute-0 nova_compute[186329]: 2025-12-05 06:20:52.970 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 05 06:20:52 compute-0 nova_compute[186329]: 2025-12-05 06:20:52.970 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 06:20:51 up 58 min,  0 user,  load average: 0.14, 0.20, 0.29\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 05 06:20:53 compute-0 nova_compute[186329]: 2025-12-05 06:20:53.042 186333 DEBUG nova.compute.provider_tree [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:20:53 compute-0 nova_compute[186329]: 2025-12-05 06:20:53.547 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:20:53 compute-0 nova_compute[186329]: 2025-12-05 06:20:53.739 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:54 compute-0 nova_compute[186329]: 2025-12-05 06:20:54.053 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 05 06:20:54 compute-0 nova_compute[186329]: 2025-12-05 06:20:54.054 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.646s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:20:54 compute-0 nova_compute[186329]: 2025-12-05 06:20:54.447 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:55 compute-0 nova_compute[186329]: 2025-12-05 06:20:55.050 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:20:55 compute-0 nova_compute[186329]: 2025-12-05 06:20:55.051 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:20:56 compute-0 nova_compute[186329]: 2025-12-05 06:20:56.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:20:58 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:58.229 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1d:77:7c 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8c6607ab-315b-4ce0-bb4d-e22d0d588c81', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa7b2a65c9a54b598b902ce6fa21d41e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d520596c-b299-4696-9930-0280ad4bb232, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=23184677-e308-4f95-b4f6-6e02e8b7fc45) old=Port_Binding(mac=['fa:16:3e:1d:77:7c'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8c6607ab-315b-4ce0-bb4d-e22d0d588c81', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fa7b2a65c9a54b598b902ce6fa21d41e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:20:58 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:58.230 104041 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 23184677-e308-4f95-b4f6-6e02e8b7fc45 in datapath 8c6607ab-315b-4ce0-bb4d-e22d0d588c81 updated
Dec 05 06:20:58 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:58.230 104041 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8c6607ab-315b-4ce0-bb4d-e22d0d588c81, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 05 06:20:58 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:20:58.231 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[c79b2c28-f2fd-4323-9688-c280ae0664d2]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:20:58 compute-0 nova_compute[186329]: 2025-12-05 06:20:58.742 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:59 compute-0 nova_compute[186329]: 2025-12-05 06:20:59.449 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:20:59 compute-0 podman[196599]: time="2025-12-05T06:20:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:20:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:20:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:20:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:20:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2587 "" "Go-http-client/1.1"
Dec 05 06:21:00 compute-0 podman[210608]: 2025-12-05 06:21:00.454635564 +0000 UTC m=+0.039663741 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Dec 05 06:21:00 compute-0 podman[210610]: 2025-12-05 06:21:00.459761978 +0000 UTC m=+0.042283426 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd)
Dec 05 06:21:00 compute-0 podman[210609]: 2025-12-05 06:21:00.481362718 +0000 UTC m=+0.065135426 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.buildah.version=1.33.7, vcs-type=git, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, managed_by=edpm_ansible, vendor=Red Hat, Inc., distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_id=edpm, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, release=1755695350)
Dec 05 06:21:01 compute-0 openstack_network_exporter[198686]: ERROR   06:21:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:21:01 compute-0 openstack_network_exporter[198686]: ERROR   06:21:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:21:01 compute-0 openstack_network_exporter[198686]: ERROR   06:21:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:21:01 compute-0 openstack_network_exporter[198686]: ERROR   06:21:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:21:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:21:01 compute-0 openstack_network_exporter[198686]: ERROR   06:21:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:21:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:21:03 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:03.575 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f9:11:e6 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-407bf781-7380-458c-992a-27796a19604d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-407bf781-7380-458c-992a-27796a19604d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bc5d63a38e00424aa78cb06b6b41bc09', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9f292807-4cd7-4f73-8938-048c09c029cf, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=26db478d-8a39-43d2-96ac-673c8a21b550) old=Port_Binding(mac=['fa:16:3e:f9:11:e6'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-407bf781-7380-458c-992a-27796a19604d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-407bf781-7380-458c-992a-27796a19604d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bc5d63a38e00424aa78cb06b6b41bc09', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:21:03 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:03.575 104041 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 26db478d-8a39-43d2-96ac-673c8a21b550 in datapath 407bf781-7380-458c-992a-27796a19604d updated
Dec 05 06:21:03 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:03.576 104041 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 407bf781-7380-458c-992a-27796a19604d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 05 06:21:03 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:03.576 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[4da1ffe7-ff08-4115-b83d-7e79ea0f57e7]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:21:03 compute-0 nova_compute[186329]: 2025-12-05 06:21:03.744 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:21:04 compute-0 nova_compute[186329]: 2025-12-05 06:21:04.451 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:21:08 compute-0 nova_compute[186329]: 2025-12-05 06:21:08.745 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:21:09 compute-0 nova_compute[186329]: 2025-12-05 06:21:09.453 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:21:13 compute-0 nova_compute[186329]: 2025-12-05 06:21:13.747 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:21:14 compute-0 nova_compute[186329]: 2025-12-05 06:21:14.455 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:21:18 compute-0 nova_compute[186329]: 2025-12-05 06:21:18.748 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:21:19 compute-0 nova_compute[186329]: 2025-12-05 06:21:19.455 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:21:21 compute-0 podman[210661]: 2025-12-05 06:21:21.455399291 +0000 UTC m=+0.038400652 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 06:21:21 compute-0 podman[210660]: 2025-12-05 06:21:21.475404361 +0000 UTC m=+0.060844798 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202)
Dec 05 06:21:21 compute-0 ovn_controller[95223]: 2025-12-05T06:21:21Z|00111|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Dec 05 06:21:23 compute-0 nova_compute[186329]: 2025-12-05 06:21:23.750 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:21:24 compute-0 nova_compute[186329]: 2025-12-05 06:21:24.457 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:21:28 compute-0 nova_compute[186329]: 2025-12-05 06:21:28.751 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:21:29 compute-0 nova_compute[186329]: 2025-12-05 06:21:29.459 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:21:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:29.504 104041 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:21:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:29.504 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:21:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:29.504 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:21:29 compute-0 podman[196599]: time="2025-12-05T06:21:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:21:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:21:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:21:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:21:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2586 "" "Go-http-client/1.1"
Dec 05 06:21:30 compute-0 nova_compute[186329]: 2025-12-05 06:21:30.491 186333 DEBUG oslo_concurrency.lockutils [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Acquiring lock "ee8648e9-dcf8-46a3-a89f-3d3dedafbe21" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:21:30 compute-0 nova_compute[186329]: 2025-12-05 06:21:30.491 186333 DEBUG oslo_concurrency.lockutils [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lock "ee8648e9-dcf8-46a3-a89f-3d3dedafbe21" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:21:30 compute-0 nova_compute[186329]: 2025-12-05 06:21:30.996 186333 DEBUG nova.compute.manager [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Dec 05 06:21:31 compute-0 openstack_network_exporter[198686]: ERROR   06:21:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:21:31 compute-0 openstack_network_exporter[198686]: ERROR   06:21:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:21:31 compute-0 openstack_network_exporter[198686]: ERROR   06:21:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:21:31 compute-0 openstack_network_exporter[198686]: ERROR   06:21:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:21:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:21:31 compute-0 openstack_network_exporter[198686]: ERROR   06:21:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:21:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:21:31 compute-0 podman[210705]: 2025-12-05 06:21:31.465354478 +0000 UTC m=+0.052529445 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.build-date=20251202)
Dec 05 06:21:31 compute-0 podman[210706]: 2025-12-05 06:21:31.46535026 +0000 UTC m=+0.050670069 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, release=1755695350, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 05 06:21:31 compute-0 podman[210707]: 2025-12-05 06:21:31.473272944 +0000 UTC m=+0.056377027 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec)
Dec 05 06:21:31 compute-0 nova_compute[186329]: 2025-12-05 06:21:31.536 186333 DEBUG oslo_concurrency.lockutils [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:21:31 compute-0 nova_compute[186329]: 2025-12-05 06:21:31.536 186333 DEBUG oslo_concurrency.lockutils [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:21:31 compute-0 nova_compute[186329]: 2025-12-05 06:21:31.541 186333 DEBUG nova.virt.hardware [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 05 06:21:31 compute-0 nova_compute[186329]: 2025-12-05 06:21:31.541 186333 INFO nova.compute.claims [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Claim successful on node compute-0.ctlplane.example.com
Dec 05 06:21:32 compute-0 nova_compute[186329]: 2025-12-05 06:21:32.597 186333 DEBUG nova.compute.provider_tree [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:21:33 compute-0 nova_compute[186329]: 2025-12-05 06:21:33.101 186333 DEBUG nova.scheduler.client.report [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:21:33 compute-0 nova_compute[186329]: 2025-12-05 06:21:33.608 186333 DEBUG oslo_concurrency.lockutils [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.071s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:21:33 compute-0 nova_compute[186329]: 2025-12-05 06:21:33.609 186333 DEBUG nova.compute.manager [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Dec 05 06:21:33 compute-0 nova_compute[186329]: 2025-12-05 06:21:33.753 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:21:34 compute-0 nova_compute[186329]: 2025-12-05 06:21:34.117 186333 DEBUG nova.compute.manager [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Dec 05 06:21:34 compute-0 nova_compute[186329]: 2025-12-05 06:21:34.117 186333 DEBUG nova.network.neutron [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Dec 05 06:21:34 compute-0 nova_compute[186329]: 2025-12-05 06:21:34.117 186333 WARNING neutronclient.v2_0.client [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:21:34 compute-0 nova_compute[186329]: 2025-12-05 06:21:34.118 186333 WARNING neutronclient.v2_0.client [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:21:34 compute-0 nova_compute[186329]: 2025-12-05 06:21:34.460 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:21:34 compute-0 nova_compute[186329]: 2025-12-05 06:21:34.622 186333 INFO nova.virt.libvirt.driver [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 06:21:34 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:34.801 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '46:84:d1', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'a6:99:88:db:d9:a2'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:21:34 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:34.801 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 05 06:21:34 compute-0 nova_compute[186329]: 2025-12-05 06:21:34.802 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:21:34 compute-0 nova_compute[186329]: 2025-12-05 06:21:34.902 186333 DEBUG nova.network.neutron [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Successfully created port: e4b4ab1b-b70e-45e2-b062-e5da4d3c3385 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Dec 05 06:21:35 compute-0 nova_compute[186329]: 2025-12-05 06:21:35.131 186333 DEBUG nova.compute.manager [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Dec 05 06:21:35 compute-0 nova_compute[186329]: 2025-12-05 06:21:35.663 186333 DEBUG nova.network.neutron [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Successfully updated port: e4b4ab1b-b70e-45e2-b062-e5da4d3c3385 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Dec 05 06:21:35 compute-0 nova_compute[186329]: 2025-12-05 06:21:35.715 186333 DEBUG nova.compute.manager [req-e211b219-36d6-4bab-afce-f0f9cc5849da req-f67d1642-18eb-4ac9-9a26-8be524e4066f fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Received event network-changed-e4b4ab1b-b70e-45e2-b062-e5da4d3c3385 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:21:35 compute-0 nova_compute[186329]: 2025-12-05 06:21:35.716 186333 DEBUG nova.compute.manager [req-e211b219-36d6-4bab-afce-f0f9cc5849da req-f67d1642-18eb-4ac9-9a26-8be524e4066f fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Refreshing instance network info cache due to event network-changed-e4b4ab1b-b70e-45e2-b062-e5da4d3c3385. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 05 06:21:35 compute-0 nova_compute[186329]: 2025-12-05 06:21:35.716 186333 DEBUG oslo_concurrency.lockutils [req-e211b219-36d6-4bab-afce-f0f9cc5849da req-f67d1642-18eb-4ac9-9a26-8be524e4066f fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "refresh_cache-ee8648e9-dcf8-46a3-a89f-3d3dedafbe21" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:21:35 compute-0 nova_compute[186329]: 2025-12-05 06:21:35.716 186333 DEBUG oslo_concurrency.lockutils [req-e211b219-36d6-4bab-afce-f0f9cc5849da req-f67d1642-18eb-4ac9-9a26-8be524e4066f fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquired lock "refresh_cache-ee8648e9-dcf8-46a3-a89f-3d3dedafbe21" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:21:35 compute-0 nova_compute[186329]: 2025-12-05 06:21:35.716 186333 DEBUG nova.network.neutron [req-e211b219-36d6-4bab-afce-f0f9cc5849da req-f67d1642-18eb-4ac9-9a26-8be524e4066f fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Refreshing network info cache for port e4b4ab1b-b70e-45e2-b062-e5da4d3c3385 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 05 06:21:36 compute-0 nova_compute[186329]: 2025-12-05 06:21:36.143 186333 DEBUG nova.compute.manager [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Dec 05 06:21:36 compute-0 nova_compute[186329]: 2025-12-05 06:21:36.144 186333 DEBUG nova.virt.libvirt.driver [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Dec 05 06:21:36 compute-0 nova_compute[186329]: 2025-12-05 06:21:36.144 186333 INFO nova.virt.libvirt.driver [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Creating image(s)
Dec 05 06:21:36 compute-0 nova_compute[186329]: 2025-12-05 06:21:36.145 186333 DEBUG oslo_concurrency.lockutils [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Acquiring lock "/var/lib/nova/instances/ee8648e9-dcf8-46a3-a89f-3d3dedafbe21/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:21:36 compute-0 nova_compute[186329]: 2025-12-05 06:21:36.145 186333 DEBUG oslo_concurrency.lockutils [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lock "/var/lib/nova/instances/ee8648e9-dcf8-46a3-a89f-3d3dedafbe21/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:21:36 compute-0 nova_compute[186329]: 2025-12-05 06:21:36.146 186333 DEBUG oslo_concurrency.lockutils [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lock "/var/lib/nova/instances/ee8648e9-dcf8-46a3-a89f-3d3dedafbe21/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:21:36 compute-0 nova_compute[186329]: 2025-12-05 06:21:36.146 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:21:36 compute-0 nova_compute[186329]: 2025-12-05 06:21:36.149 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:21:36 compute-0 nova_compute[186329]: 2025-12-05 06:21:36.150 186333 DEBUG oslo_concurrency.processutils [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:21:36 compute-0 nova_compute[186329]: 2025-12-05 06:21:36.167 186333 DEBUG oslo_concurrency.lockutils [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Acquiring lock "refresh_cache-ee8648e9-dcf8-46a3-a89f-3d3dedafbe21" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:21:36 compute-0 nova_compute[186329]: 2025-12-05 06:21:36.190 186333 DEBUG oslo_concurrency.processutils [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:21:36 compute-0 nova_compute[186329]: 2025-12-05 06:21:36.190 186333 DEBUG oslo_concurrency.lockutils [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Acquiring lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:21:36 compute-0 nova_compute[186329]: 2025-12-05 06:21:36.191 186333 DEBUG oslo_concurrency.lockutils [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:21:36 compute-0 nova_compute[186329]: 2025-12-05 06:21:36.191 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:21:36 compute-0 nova_compute[186329]: 2025-12-05 06:21:36.194 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:21:36 compute-0 nova_compute[186329]: 2025-12-05 06:21:36.194 186333 DEBUG oslo_concurrency.processutils [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:21:36 compute-0 nova_compute[186329]: 2025-12-05 06:21:36.219 186333 WARNING neutronclient.v2_0.client [req-e211b219-36d6-4bab-afce-f0f9cc5849da req-f67d1642-18eb-4ac9-9a26-8be524e4066f fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:21:36 compute-0 nova_compute[186329]: 2025-12-05 06:21:36.233 186333 DEBUG oslo_concurrency.processutils [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.038s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:21:36 compute-0 nova_compute[186329]: 2025-12-05 06:21:36.233 186333 DEBUG oslo_concurrency.processutils [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b,backing_fmt=raw /var/lib/nova/instances/ee8648e9-dcf8-46a3-a89f-3d3dedafbe21/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:21:36 compute-0 nova_compute[186329]: 2025-12-05 06:21:36.250 186333 DEBUG oslo_concurrency.processutils [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b,backing_fmt=raw /var/lib/nova/instances/ee8648e9-dcf8-46a3-a89f-3d3dedafbe21/disk 1073741824" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:21:36 compute-0 nova_compute[186329]: 2025-12-05 06:21:36.251 186333 DEBUG oslo_concurrency.lockutils [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.060s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:21:36 compute-0 nova_compute[186329]: 2025-12-05 06:21:36.251 186333 DEBUG oslo_concurrency.processutils [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:21:36 compute-0 nova_compute[186329]: 2025-12-05 06:21:36.292 186333 DEBUG oslo_concurrency.processutils [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:21:36 compute-0 nova_compute[186329]: 2025-12-05 06:21:36.293 186333 DEBUG nova.virt.disk.api [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Checking if we can resize image /var/lib/nova/instances/ee8648e9-dcf8-46a3-a89f-3d3dedafbe21/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 05 06:21:36 compute-0 nova_compute[186329]: 2025-12-05 06:21:36.293 186333 DEBUG oslo_concurrency.processutils [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee8648e9-dcf8-46a3-a89f-3d3dedafbe21/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:21:36 compute-0 nova_compute[186329]: 2025-12-05 06:21:36.334 186333 DEBUG oslo_concurrency.processutils [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee8648e9-dcf8-46a3-a89f-3d3dedafbe21/disk --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:21:36 compute-0 nova_compute[186329]: 2025-12-05 06:21:36.334 186333 DEBUG nova.virt.disk.api [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Cannot resize image /var/lib/nova/instances/ee8648e9-dcf8-46a3-a89f-3d3dedafbe21/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 05 06:21:36 compute-0 nova_compute[186329]: 2025-12-05 06:21:36.335 186333 DEBUG nova.virt.libvirt.driver [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Dec 05 06:21:36 compute-0 nova_compute[186329]: 2025-12-05 06:21:36.335 186333 DEBUG nova.virt.libvirt.driver [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Ensure instance console log exists: /var/lib/nova/instances/ee8648e9-dcf8-46a3-a89f-3d3dedafbe21/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 05 06:21:36 compute-0 nova_compute[186329]: 2025-12-05 06:21:36.335 186333 DEBUG oslo_concurrency.lockutils [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:21:36 compute-0 nova_compute[186329]: 2025-12-05 06:21:36.336 186333 DEBUG oslo_concurrency.lockutils [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:21:36 compute-0 nova_compute[186329]: 2025-12-05 06:21:36.336 186333 DEBUG oslo_concurrency.lockutils [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:21:36 compute-0 nova_compute[186329]: 2025-12-05 06:21:36.517 186333 DEBUG nova.network.neutron [req-e211b219-36d6-4bab-afce-f0f9cc5849da req-f67d1642-18eb-4ac9-9a26-8be524e4066f fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 05 06:21:36 compute-0 nova_compute[186329]: 2025-12-05 06:21:36.642 186333 DEBUG nova.network.neutron [req-e211b219-36d6-4bab-afce-f0f9cc5849da req-f67d1642-18eb-4ac9-9a26-8be524e4066f fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:21:37 compute-0 nova_compute[186329]: 2025-12-05 06:21:37.147 186333 DEBUG oslo_concurrency.lockutils [req-e211b219-36d6-4bab-afce-f0f9cc5849da req-f67d1642-18eb-4ac9-9a26-8be524e4066f fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Releasing lock "refresh_cache-ee8648e9-dcf8-46a3-a89f-3d3dedafbe21" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:21:37 compute-0 nova_compute[186329]: 2025-12-05 06:21:37.148 186333 DEBUG oslo_concurrency.lockutils [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Acquired lock "refresh_cache-ee8648e9-dcf8-46a3-a89f-3d3dedafbe21" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:21:37 compute-0 nova_compute[186329]: 2025-12-05 06:21:37.149 186333 DEBUG nova.network.neutron [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 05 06:21:38 compute-0 nova_compute[186329]: 2025-12-05 06:21:38.525 186333 DEBUG nova.network.neutron [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 05 06:21:38 compute-0 nova_compute[186329]: 2025-12-05 06:21:38.691 186333 WARNING neutronclient.v2_0.client [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:21:38 compute-0 nova_compute[186329]: 2025-12-05 06:21:38.753 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:21:39 compute-0 nova_compute[186329]: 2025-12-05 06:21:39.460 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:21:39 compute-0 nova_compute[186329]: 2025-12-05 06:21:39.594 186333 DEBUG nova.network.neutron [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Updating instance_info_cache with network_info: [{"id": "e4b4ab1b-b70e-45e2-b062-e5da4d3c3385", "address": "fa:16:3e:55:b2:c5", "network": {"id": "8c6607ab-315b-4ce0-bb4d-e22d0d588c81", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1945444158-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa7b2a65c9a54b598b902ce6fa21d41e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape4b4ab1b-b7", "ovs_interfaceid": "e4b4ab1b-b70e-45e2-b062-e5da4d3c3385", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.099 186333 DEBUG oslo_concurrency.lockutils [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Releasing lock "refresh_cache-ee8648e9-dcf8-46a3-a89f-3d3dedafbe21" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.100 186333 DEBUG nova.compute.manager [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Instance network_info: |[{"id": "e4b4ab1b-b70e-45e2-b062-e5da4d3c3385", "address": "fa:16:3e:55:b2:c5", "network": {"id": "8c6607ab-315b-4ce0-bb4d-e22d0d588c81", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1945444158-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa7b2a65c9a54b598b902ce6fa21d41e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape4b4ab1b-b7", "ovs_interfaceid": "e4b4ab1b-b70e-45e2-b062-e5da4d3c3385", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.102 186333 DEBUG nova.virt.libvirt.driver [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Start _get_guest_xml network_info=[{"id": "e4b4ab1b-b70e-45e2-b062-e5da4d3c3385", "address": "fa:16:3e:55:b2:c5", "network": {"id": "8c6607ab-315b-4ce0-bb4d-e22d0d588c81", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1945444158-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa7b2a65c9a54b598b902ce6fa21d41e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape4b4ab1b-b7", "ovs_interfaceid": "e4b4ab1b-b70e-45e2-b062-e5da4d3c3385", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T06:07:47Z,direct_url=<?>,disk_format='qcow2',id=6903ca06-7f44-4ad2-ab8b-0d16feef7d51,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fcef582be2274b9ba43451b49b4066ec',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T06:07:49Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'device_name': '/dev/vda', 'guest_format': None, 'image_id': '6903ca06-7f44-4ad2-ab8b-0d16feef7d51'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.105 186333 WARNING nova.virt.libvirt.driver [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.106 186333 DEBUG nova.virt.driver [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='6903ca06-7f44-4ad2-ab8b-0d16feef7d51', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-1959212984', uuid='ee8648e9-dcf8-46a3-a89f-3d3dedafbe21'), owner=OwnerMeta(userid='1b1d9849dd3f4328991385825f24dc8f', username='tempest-TestExecuteNodeResourceConsolidationStrategy-409405411-project-admin', projectid='bc5d63a38e00424aa78cb06b6b41bc09', projectname='tempest-TestExecuteNodeResourceConsolidationStrategy-409405411'), image=ImageMeta(id='6903ca06-7f44-4ad2-ab8b-0d16feef7d51', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='cb13e320-971c-46c2-a935-d695f3631bf8', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "e4b4ab1b-b70e-45e2-b062-e5da4d3c3385", "address": "fa:16:3e:55:b2:c5", "network": {"id": "8c6607ab-315b-4ce0-bb4d-e22d0d588c81", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1945444158-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa7b2a65c9a54b598b902ce6fa21d41e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": 
"tape4b4ab1b-b7", "ovs_interfaceid": "e4b4ab1b-b70e-45e2-b062-e5da4d3c3385", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764915700.1059537) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.110 186333 DEBUG nova.virt.libvirt.host [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.111 186333 DEBUG nova.virt.libvirt.host [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.113 186333 DEBUG nova.virt.libvirt.host [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.114 186333 DEBUG nova.virt.libvirt.host [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.114 186333 DEBUG nova.virt.libvirt.driver [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.115 186333 DEBUG nova.virt.hardware [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T06:07:46Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cb13e320-971c-46c2-a935-d695f3631bf8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T06:07:47Z,direct_url=<?>,disk_format='qcow2',id=6903ca06-7f44-4ad2-ab8b-0d16feef7d51,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fcef582be2274b9ba43451b49b4066ec',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T06:07:49Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.115 186333 DEBUG nova.virt.hardware [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.115 186333 DEBUG nova.virt.hardware [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.116 186333 DEBUG nova.virt.hardware [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.116 186333 DEBUG nova.virt.hardware [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.116 186333 DEBUG nova.virt.hardware [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.116 186333 DEBUG nova.virt.hardware [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.117 186333 DEBUG nova.virt.hardware [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.117 186333 DEBUG nova.virt.hardware [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.117 186333 DEBUG nova.virt.hardware [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.117 186333 DEBUG nova.virt.hardware [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.120 186333 DEBUG nova.virt.libvirt.vif [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-05T06:21:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-1959212984',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-195',id=13,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bc5d63a38e00424aa78cb06b6b41bc09',ramdisk_id='',reservation_id='r-mdfznq5f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-409405411',owne
r_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-409405411-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T06:21:35Z,user_data=None,user_id='1b1d9849dd3f4328991385825f24dc8f',uuid=ee8648e9-dcf8-46a3-a89f-3d3dedafbe21,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e4b4ab1b-b70e-45e2-b062-e5da4d3c3385", "address": "fa:16:3e:55:b2:c5", "network": {"id": "8c6607ab-315b-4ce0-bb4d-e22d0d588c81", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1945444158-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa7b2a65c9a54b598b902ce6fa21d41e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape4b4ab1b-b7", "ovs_interfaceid": "e4b4ab1b-b70e-45e2-b062-e5da4d3c3385", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.120 186333 DEBUG nova.network.os_vif_util [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Converting VIF {"id": "e4b4ab1b-b70e-45e2-b062-e5da4d3c3385", "address": "fa:16:3e:55:b2:c5", "network": {"id": "8c6607ab-315b-4ce0-bb4d-e22d0d588c81", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1945444158-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa7b2a65c9a54b598b902ce6fa21d41e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape4b4ab1b-b7", "ovs_interfaceid": "e4b4ab1b-b70e-45e2-b062-e5da4d3c3385", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.121 186333 DEBUG nova.network.os_vif_util [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:55:b2:c5,bridge_name='br-int',has_traffic_filtering=True,id=e4b4ab1b-b70e-45e2-b062-e5da4d3c3385,network=Network(8c6607ab-315b-4ce0-bb4d-e22d0d588c81),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape4b4ab1b-b7') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.122 186333 DEBUG nova.objects.instance [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lazy-loading 'pci_devices' on Instance uuid ee8648e9-dcf8-46a3-a89f-3d3dedafbe21 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.632 186333 DEBUG nova.virt.libvirt.driver [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] End _get_guest_xml xml=<domain type="kvm">
Dec 05 06:21:40 compute-0 nova_compute[186329]:   <uuid>ee8648e9-dcf8-46a3-a89f-3d3dedafbe21</uuid>
Dec 05 06:21:40 compute-0 nova_compute[186329]:   <name>instance-0000000d</name>
Dec 05 06:21:40 compute-0 nova_compute[186329]:   <memory>131072</memory>
Dec 05 06:21:40 compute-0 nova_compute[186329]:   <vcpu>1</vcpu>
Dec 05 06:21:40 compute-0 nova_compute[186329]:   <metadata>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 06:21:40 compute-0 nova_compute[186329]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-1959212984</nova:name>
Dec 05 06:21:40 compute-0 nova_compute[186329]:       <nova:creationTime>2025-12-05 06:21:40</nova:creationTime>
Dec 05 06:21:40 compute-0 nova_compute[186329]:       <nova:flavor name="m1.nano" id="cb13e320-971c-46c2-a935-d695f3631bf8">
Dec 05 06:21:40 compute-0 nova_compute[186329]:         <nova:memory>128</nova:memory>
Dec 05 06:21:40 compute-0 nova_compute[186329]:         <nova:disk>1</nova:disk>
Dec 05 06:21:40 compute-0 nova_compute[186329]:         <nova:swap>0</nova:swap>
Dec 05 06:21:40 compute-0 nova_compute[186329]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 06:21:40 compute-0 nova_compute[186329]:         <nova:vcpus>1</nova:vcpus>
Dec 05 06:21:40 compute-0 nova_compute[186329]:         <nova:extraSpecs>
Dec 05 06:21:40 compute-0 nova_compute[186329]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 05 06:21:40 compute-0 nova_compute[186329]:         </nova:extraSpecs>
Dec 05 06:21:40 compute-0 nova_compute[186329]:       </nova:flavor>
Dec 05 06:21:40 compute-0 nova_compute[186329]:       <nova:image uuid="6903ca06-7f44-4ad2-ab8b-0d16feef7d51">
Dec 05 06:21:40 compute-0 nova_compute[186329]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 05 06:21:40 compute-0 nova_compute[186329]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 05 06:21:40 compute-0 nova_compute[186329]:         <nova:minDisk>1</nova:minDisk>
Dec 05 06:21:40 compute-0 nova_compute[186329]:         <nova:minRam>0</nova:minRam>
Dec 05 06:21:40 compute-0 nova_compute[186329]:         <nova:properties>
Dec 05 06:21:40 compute-0 nova_compute[186329]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 05 06:21:40 compute-0 nova_compute[186329]:         </nova:properties>
Dec 05 06:21:40 compute-0 nova_compute[186329]:       </nova:image>
Dec 05 06:21:40 compute-0 nova_compute[186329]:       <nova:owner>
Dec 05 06:21:40 compute-0 nova_compute[186329]:         <nova:user uuid="1b1d9849dd3f4328991385825f24dc8f">tempest-TestExecuteNodeResourceConsolidationStrategy-409405411-project-admin</nova:user>
Dec 05 06:21:40 compute-0 nova_compute[186329]:         <nova:project uuid="bc5d63a38e00424aa78cb06b6b41bc09">tempest-TestExecuteNodeResourceConsolidationStrategy-409405411</nova:project>
Dec 05 06:21:40 compute-0 nova_compute[186329]:       </nova:owner>
Dec 05 06:21:40 compute-0 nova_compute[186329]:       <nova:root type="image" uuid="6903ca06-7f44-4ad2-ab8b-0d16feef7d51"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:       <nova:ports>
Dec 05 06:21:40 compute-0 nova_compute[186329]:         <nova:port uuid="e4b4ab1b-b70e-45e2-b062-e5da4d3c3385">
Dec 05 06:21:40 compute-0 nova_compute[186329]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:         </nova:port>
Dec 05 06:21:40 compute-0 nova_compute[186329]:       </nova:ports>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     </nova:instance>
Dec 05 06:21:40 compute-0 nova_compute[186329]:   </metadata>
Dec 05 06:21:40 compute-0 nova_compute[186329]:   <sysinfo type="smbios">
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <system>
Dec 05 06:21:40 compute-0 nova_compute[186329]:       <entry name="manufacturer">RDO</entry>
Dec 05 06:21:40 compute-0 nova_compute[186329]:       <entry name="product">OpenStack Compute</entry>
Dec 05 06:21:40 compute-0 nova_compute[186329]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 05 06:21:40 compute-0 nova_compute[186329]:       <entry name="serial">ee8648e9-dcf8-46a3-a89f-3d3dedafbe21</entry>
Dec 05 06:21:40 compute-0 nova_compute[186329]:       <entry name="uuid">ee8648e9-dcf8-46a3-a89f-3d3dedafbe21</entry>
Dec 05 06:21:40 compute-0 nova_compute[186329]:       <entry name="family">Virtual Machine</entry>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     </system>
Dec 05 06:21:40 compute-0 nova_compute[186329]:   </sysinfo>
Dec 05 06:21:40 compute-0 nova_compute[186329]:   <os>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <boot dev="hd"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <smbios mode="sysinfo"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:   </os>
Dec 05 06:21:40 compute-0 nova_compute[186329]:   <features>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <acpi/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <apic/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <vmcoreinfo/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:   </features>
Dec 05 06:21:40 compute-0 nova_compute[186329]:   <clock offset="utc">
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <timer name="hpet" present="no"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:   </clock>
Dec 05 06:21:40 compute-0 nova_compute[186329]:   <cpu mode="custom" match="exact">
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <model>Nehalem</model>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:   </cpu>
Dec 05 06:21:40 compute-0 nova_compute[186329]:   <devices>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <disk type="file" device="disk">
Dec 05 06:21:40 compute-0 nova_compute[186329]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:       <source file="/var/lib/nova/instances/ee8648e9-dcf8-46a3-a89f-3d3dedafbe21/disk"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:       <target dev="vda" bus="virtio"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     </disk>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <disk type="file" device="cdrom">
Dec 05 06:21:40 compute-0 nova_compute[186329]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:       <source file="/var/lib/nova/instances/ee8648e9-dcf8-46a3-a89f-3d3dedafbe21/disk.config"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:       <target dev="sda" bus="sata"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     </disk>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <interface type="ethernet">
Dec 05 06:21:40 compute-0 nova_compute[186329]:       <mac address="fa:16:3e:55:b2:c5"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:       <model type="virtio"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:       <mtu size="1442"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:       <target dev="tape4b4ab1b-b7"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     </interface>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <serial type="pty">
Dec 05 06:21:40 compute-0 nova_compute[186329]:       <log file="/var/lib/nova/instances/ee8648e9-dcf8-46a3-a89f-3d3dedafbe21/console.log" append="off"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     </serial>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <video>
Dec 05 06:21:40 compute-0 nova_compute[186329]:       <model type="virtio"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     </video>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <input type="tablet" bus="usb"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <rng model="virtio">
Dec 05 06:21:40 compute-0 nova_compute[186329]:       <backend model="random">/dev/urandom</backend>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     </rng>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <controller type="usb" index="0"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 05 06:21:40 compute-0 nova_compute[186329]:       <stats period="10"/>
Dec 05 06:21:40 compute-0 nova_compute[186329]:     </memballoon>
Dec 05 06:21:40 compute-0 nova_compute[186329]:   </devices>
Dec 05 06:21:40 compute-0 nova_compute[186329]: </domain>
Dec 05 06:21:40 compute-0 nova_compute[186329]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.634 186333 DEBUG nova.compute.manager [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Preparing to wait for external event network-vif-plugged-e4b4ab1b-b70e-45e2-b062-e5da4d3c3385 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.634 186333 DEBUG oslo_concurrency.lockutils [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Acquiring lock "ee8648e9-dcf8-46a3-a89f-3d3dedafbe21-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.634 186333 DEBUG oslo_concurrency.lockutils [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lock "ee8648e9-dcf8-46a3-a89f-3d3dedafbe21-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.635 186333 DEBUG oslo_concurrency.lockutils [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lock "ee8648e9-dcf8-46a3-a89f-3d3dedafbe21-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.635 186333 DEBUG nova.virt.libvirt.vif [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-05T06:21:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-1959212984',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-195',id=13,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bc5d63a38e00424aa78cb06b6b41bc09',ramdisk_id='',reservation_id='r-mdfznq5f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-40940
5411',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-409405411-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T06:21:35Z,user_data=None,user_id='1b1d9849dd3f4328991385825f24dc8f',uuid=ee8648e9-dcf8-46a3-a89f-3d3dedafbe21,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e4b4ab1b-b70e-45e2-b062-e5da4d3c3385", "address": "fa:16:3e:55:b2:c5", "network": {"id": "8c6607ab-315b-4ce0-bb4d-e22d0d588c81", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1945444158-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa7b2a65c9a54b598b902ce6fa21d41e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape4b4ab1b-b7", "ovs_interfaceid": "e4b4ab1b-b70e-45e2-b062-e5da4d3c3385", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.635 186333 DEBUG nova.network.os_vif_util [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Converting VIF {"id": "e4b4ab1b-b70e-45e2-b062-e5da4d3c3385", "address": "fa:16:3e:55:b2:c5", "network": {"id": "8c6607ab-315b-4ce0-bb4d-e22d0d588c81", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1945444158-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa7b2a65c9a54b598b902ce6fa21d41e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape4b4ab1b-b7", "ovs_interfaceid": "e4b4ab1b-b70e-45e2-b062-e5da4d3c3385", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.636 186333 DEBUG nova.network.os_vif_util [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:55:b2:c5,bridge_name='br-int',has_traffic_filtering=True,id=e4b4ab1b-b70e-45e2-b062-e5da4d3c3385,network=Network(8c6607ab-315b-4ce0-bb4d-e22d0d588c81),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape4b4ab1b-b7') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.636 186333 DEBUG os_vif [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:55:b2:c5,bridge_name='br-int',has_traffic_filtering=True,id=e4b4ab1b-b70e-45e2-b062-e5da4d3c3385,network=Network(8c6607ab-315b-4ce0-bb4d-e22d0d588c81),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape4b4ab1b-b7') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.637 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.637 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.638 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.638 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.638 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '5291880f-016a-5d92-9d01-519b99cf4c1b', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.639 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.640 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.642 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.642 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape4b4ab1b-b7, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.642 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tape4b4ab1b-b7, col_values=(('qos', UUID('d69436a2-7b4b-4833-90f1-57782ee257d1')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.642 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tape4b4ab1b-b7, col_values=(('external_ids', {'iface-id': 'e4b4ab1b-b70e-45e2-b062-e5da4d3c3385', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:55:b2:c5', 'vm-uuid': 'ee8648e9-dcf8-46a3-a89f-3d3dedafbe21'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.643 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:21:40 compute-0 NetworkManager[55434]: <info>  [1764915700.6443] manager: (tape4b4ab1b-b7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/51)
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.646 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.647 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:21:40 compute-0 nova_compute[186329]: 2025-12-05 06:21:40.648 186333 INFO os_vif [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:55:b2:c5,bridge_name='br-int',has_traffic_filtering=True,id=e4b4ab1b-b70e-45e2-b062-e5da4d3c3385,network=Network(8c6607ab-315b-4ce0-bb4d-e22d0d588c81),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape4b4ab1b-b7')
Dec 05 06:21:41 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:41.802 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=89d40815-76f5-4f1d-9077-84d831b7d6c4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:21:42 compute-0 nova_compute[186329]: 2025-12-05 06:21:42.172 186333 DEBUG nova.virt.libvirt.driver [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 05 06:21:42 compute-0 nova_compute[186329]: 2025-12-05 06:21:42.173 186333 DEBUG nova.virt.libvirt.driver [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 05 06:21:42 compute-0 nova_compute[186329]: 2025-12-05 06:21:42.173 186333 DEBUG nova.virt.libvirt.driver [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] No VIF found with MAC fa:16:3e:55:b2:c5, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 05 06:21:42 compute-0 nova_compute[186329]: 2025-12-05 06:21:42.173 186333 INFO nova.virt.libvirt.driver [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Using config drive
Dec 05 06:21:42 compute-0 nova_compute[186329]: 2025-12-05 06:21:42.680 186333 WARNING neutronclient.v2_0.client [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:21:43 compute-0 nova_compute[186329]: 2025-12-05 06:21:43.566 186333 INFO nova.virt.libvirt.driver [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Creating config drive at /var/lib/nova/instances/ee8648e9-dcf8-46a3-a89f-3d3dedafbe21/disk.config
Dec 05 06:21:43 compute-0 nova_compute[186329]: 2025-12-05 06:21:43.570 186333 DEBUG oslo_concurrency.processutils [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ee8648e9-dcf8-46a3-a89f-3d3dedafbe21/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpg55bwp7g execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:21:43 compute-0 nova_compute[186329]: 2025-12-05 06:21:43.688 186333 DEBUG oslo_concurrency.processutils [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ee8648e9-dcf8-46a3-a89f-3d3dedafbe21/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpg55bwp7g" returned: 0 in 0.118s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:21:43 compute-0 kernel: tape4b4ab1b-b7: entered promiscuous mode
Dec 05 06:21:43 compute-0 NetworkManager[55434]: <info>  [1764915703.7263] manager: (tape4b4ab1b-b7): new Tun device (/org/freedesktop/NetworkManager/Devices/52)
Dec 05 06:21:43 compute-0 ovn_controller[95223]: 2025-12-05T06:21:43Z|00112|binding|INFO|Claiming lport e4b4ab1b-b70e-45e2-b062-e5da4d3c3385 for this chassis.
Dec 05 06:21:43 compute-0 ovn_controller[95223]: 2025-12-05T06:21:43Z|00113|binding|INFO|e4b4ab1b-b70e-45e2-b062-e5da4d3c3385: Claiming fa:16:3e:55:b2:c5 10.100.0.5
Dec 05 06:21:43 compute-0 nova_compute[186329]: 2025-12-05 06:21:43.728 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:21:43 compute-0 nova_compute[186329]: 2025-12-05 06:21:43.730 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:21:43 compute-0 nova_compute[186329]: 2025-12-05 06:21:43.733 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:43.738 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:55:b2:c5 10.100.0.5'], port_security=['fa:16:3e:55:b2:c5 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'ee8648e9-dcf8-46a3-a89f-3d3dedafbe21', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8c6607ab-315b-4ce0-bb4d-e22d0d588c81', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bc5d63a38e00424aa78cb06b6b41bc09', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7b69bd8c-f5f0-4d39-bcf7-4157a4a0e235', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d520596c-b299-4696-9930-0280ad4bb232, chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], logical_port=e4b4ab1b-b70e-45e2-b062-e5da4d3c3385) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:43.739 104041 INFO neutron.agent.ovn.metadata.agent [-] Port e4b4ab1b-b70e-45e2-b062-e5da4d3c3385 in datapath 8c6607ab-315b-4ce0-bb4d-e22d0d588c81 bound to our chassis
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:43.739 104041 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8c6607ab-315b-4ce0-bb4d-e22d0d588c81
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:43.752 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[8a6deba6-5eb4-4e9a-af21-a400a264267f]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:21:43 compute-0 systemd-udevd[210793]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:43.756 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8c6607ab-31 in ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 05 06:21:43 compute-0 systemd-machined[152967]: New machine qemu-9-instance-0000000d.
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:43.757 206383 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8c6607ab-30 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:43.757 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[20f0e864-751c-49d8-bc86-6caa13d2c152]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:43.758 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[76413847-58a8-4401-8b75-78dce3fad031]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:21:43 compute-0 NetworkManager[55434]: <info>  [1764915703.7659] device (tape4b4ab1b-b7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 06:21:43 compute-0 NetworkManager[55434]: <info>  [1764915703.7666] device (tape4b4ab1b-b7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:43.765 104153 DEBUG oslo.privsep.daemon [-] privsep: reply[35815eeb-0f5f-46a6-9ad8-501f9794b32c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:21:43 compute-0 systemd[1]: Started Virtual Machine qemu-9-instance-0000000d.
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:43.789 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[4da02ba7-982c-464b-9351-df49af4f43ee]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:21:43 compute-0 nova_compute[186329]: 2025-12-05 06:21:43.789 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:21:43 compute-0 ovn_controller[95223]: 2025-12-05T06:21:43Z|00114|binding|INFO|Setting lport e4b4ab1b-b70e-45e2-b062-e5da4d3c3385 ovn-installed in OVS
Dec 05 06:21:43 compute-0 ovn_controller[95223]: 2025-12-05T06:21:43Z|00115|binding|INFO|Setting lport e4b4ab1b-b70e-45e2-b062-e5da4d3c3385 up in Southbound
Dec 05 06:21:43 compute-0 nova_compute[186329]: 2025-12-05 06:21:43.795 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:43.806 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[129eb9be-cd31-481c-927f-ef077f68a039]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:21:43 compute-0 NetworkManager[55434]: <info>  [1764915703.8101] manager: (tap8c6607ab-30): new Veth device (/org/freedesktop/NetworkManager/Devices/53)
Dec 05 06:21:43 compute-0 systemd-udevd[210796]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:43.810 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[856bca7d-69e7-4cbd-9802-7885af799bb5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:43.832 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[21d9ffa2-3926-4cd9-89d8-d6ff4f528421]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:43.835 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[40278ad8-1537-4083-a29e-6b6e70782b4f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:21:43 compute-0 NetworkManager[55434]: <info>  [1764915703.8482] device (tap8c6607ab-30): carrier: link connected
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:43.853 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[e6344fad-dac6-41f3-8018-9e256db4b872]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:43.865 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[af203da3-3001-444d-ad31-f542767e0dbc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8c6607ab-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1d:77:7c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 358238, 'reachable_time': 18287, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210817, 'error': None, 'target': 'ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:43.875 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[89028e66-30b5-42db-951b-dfc40b403de4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1d:777c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 358238, 'tstamp': 358238}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210818, 'error': None, 'target': 'ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:43.885 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[dbdfeeeb-aa93-4dbb-8668-9a679d73ed59]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8c6607ab-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1d:77:7c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 358238, 'reachable_time': 18287, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 210819, 'error': None, 'target': 'ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:43.904 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[2e2d97de-a53f-432d-8c61-e18ce8f5dc41]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:43.942 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[f8884022-812b-4db9-916f-d5a622c6ea30]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:43.942 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8c6607ab-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:43.943 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:43.943 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8c6607ab-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:21:43 compute-0 NetworkManager[55434]: <info>  [1764915703.9448] manager: (tap8c6607ab-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/54)
Dec 05 06:21:43 compute-0 kernel: tap8c6607ab-30: entered promiscuous mode
Dec 05 06:21:43 compute-0 nova_compute[186329]: 2025-12-05 06:21:43.945 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:43.949 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8c6607ab-30, col_values=(('external_ids', {'iface-id': '23184677-e308-4f95-b4f6-6e02e8b7fc45'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:21:43 compute-0 nova_compute[186329]: 2025-12-05 06:21:43.949 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:21:43 compute-0 ovn_controller[95223]: 2025-12-05T06:21:43Z|00116|binding|INFO|Releasing lport 23184677-e308-4f95-b4f6-6e02e8b7fc45 from this chassis (sb_readonly=0)
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:43.967 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[b33758de-9b68-4d4f-baec-22cc606e9337]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:43.967 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8c6607ab-315b-4ce0-bb4d-e22d0d588c81.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8c6607ab-315b-4ce0-bb4d-e22d0d588c81.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:43.968 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8c6607ab-315b-4ce0-bb4d-e22d0d588c81.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8c6607ab-315b-4ce0-bb4d-e22d0d588c81.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:43.968 104041 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 8c6607ab-315b-4ce0-bb4d-e22d0d588c81 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:43.968 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8c6607ab-315b-4ce0-bb4d-e22d0d588c81.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8c6607ab-315b-4ce0-bb4d-e22d0d588c81.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:21:43 compute-0 nova_compute[186329]: 2025-12-05 06:21:43.969 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:43.968 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[6afe0a80-e99d-462e-b603-42f8a56f5abb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:43.970 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8c6607ab-315b-4ce0-bb4d-e22d0d588c81.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8c6607ab-315b-4ce0-bb4d-e22d0d588c81.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:43.970 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[77d5fbf3-d01c-4506-91bd-c1339cbc8fec]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:43.970 104041 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]: global
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]:     log         /dev/log local0 debug
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]:     log-tag     haproxy-metadata-proxy-8c6607ab-315b-4ce0-bb4d-e22d0d588c81
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]:     user        root
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]:     group       root
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]:     maxconn     1024
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]:     pidfile     /var/lib/neutron/external/pids/8c6607ab-315b-4ce0-bb4d-e22d0d588c81.pid.haproxy
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]:     daemon
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]: defaults
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]:     log global
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]:     mode http
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]:     option httplog
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]:     option dontlognull
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]:     option http-server-close
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]:     option forwardfor
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]:     retries                 3
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]:     timeout http-request    30s
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]:     timeout connect         30s
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]:     timeout client          32s
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]:     timeout server          32s
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]:     timeout http-keep-alive 30s
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]: listen listener
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]:     bind 169.254.169.254:80
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]:     
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]:     http-request add-header X-OVN-Network-ID 8c6607ab-315b-4ce0-bb4d-e22d0d588c81
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 05 06:21:43 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:21:43.972 104041 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81', 'env', 'PROCESS_TAG=haproxy-8c6607ab-315b-4ce0-bb4d-e22d0d588c81', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8c6607ab-315b-4ce0-bb4d-e22d0d588c81.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 05 06:21:44 compute-0 podman[210852]: 2025-12-05 06:21:44.26818332 +0000 UTC m=+0.032038172 container create 881badd4f71b9610f3f7c5ab5c0f6f2b28e18ed72a681d49622e6c4aa3e116fa (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Dec 05 06:21:44 compute-0 systemd[1]: Started libpod-conmon-881badd4f71b9610f3f7c5ab5c0f6f2b28e18ed72a681d49622e6c4aa3e116fa.scope.
Dec 05 06:21:44 compute-0 systemd[1]: Started libcrun container.
Dec 05 06:21:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6d279c169207874742a88ca4e6aeb0c993b0dbfda52138e5b1d94ff9facdf7ec/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 06:21:44 compute-0 podman[210852]: 2025-12-05 06:21:44.309076357 +0000 UTC m=+0.072931229 container init 881badd4f71b9610f3f7c5ab5c0f6f2b28e18ed72a681d49622e6c4aa3e116fa (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4)
Dec 05 06:21:44 compute-0 podman[210852]: 2025-12-05 06:21:44.31436486 +0000 UTC m=+0.078219712 container start 881badd4f71b9610f3f7c5ab5c0f6f2b28e18ed72a681d49622e6c4aa3e116fa (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 05 06:21:44 compute-0 podman[210852]: 2025-12-05 06:21:44.253798095 +0000 UTC m=+0.017652968 image pull 773fbabd60bc1c8e2ad203301a187a593937fa42bcc751aba0971bd96baac0cb quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current
Dec 05 06:21:44 compute-0 neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81[210866]: [NOTICE]   (210870) : New worker (210872) forked
Dec 05 06:21:44 compute-0 neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81[210866]: [NOTICE]   (210870) : Loading success.
Dec 05 06:21:44 compute-0 nova_compute[186329]: 2025-12-05 06:21:44.627 186333 DEBUG nova.compute.manager [req-c310eadf-c798-467e-843c-bcee42e42674 req-e597c726-16dd-45c8-9d5b-bd5d941d7729 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Received event network-vif-plugged-e4b4ab1b-b70e-45e2-b062-e5da4d3c3385 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:21:44 compute-0 nova_compute[186329]: 2025-12-05 06:21:44.627 186333 DEBUG oslo_concurrency.lockutils [req-c310eadf-c798-467e-843c-bcee42e42674 req-e597c726-16dd-45c8-9d5b-bd5d941d7729 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "ee8648e9-dcf8-46a3-a89f-3d3dedafbe21-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:21:44 compute-0 nova_compute[186329]: 2025-12-05 06:21:44.627 186333 DEBUG oslo_concurrency.lockutils [req-c310eadf-c798-467e-843c-bcee42e42674 req-e597c726-16dd-45c8-9d5b-bd5d941d7729 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "ee8648e9-dcf8-46a3-a89f-3d3dedafbe21-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:21:44 compute-0 nova_compute[186329]: 2025-12-05 06:21:44.627 186333 DEBUG oslo_concurrency.lockutils [req-c310eadf-c798-467e-843c-bcee42e42674 req-e597c726-16dd-45c8-9d5b-bd5d941d7729 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "ee8648e9-dcf8-46a3-a89f-3d3dedafbe21-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:21:44 compute-0 nova_compute[186329]: 2025-12-05 06:21:44.627 186333 DEBUG nova.compute.manager [req-c310eadf-c798-467e-843c-bcee42e42674 req-e597c726-16dd-45c8-9d5b-bd5d941d7729 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Processing event network-vif-plugged-e4b4ab1b-b70e-45e2-b062-e5da4d3c3385 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 05 06:21:44 compute-0 nova_compute[186329]: 2025-12-05 06:21:44.628 186333 DEBUG nova.compute.manager [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 05 06:21:44 compute-0 nova_compute[186329]: 2025-12-05 06:21:44.632 186333 DEBUG nova.virt.libvirt.driver [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Dec 05 06:21:44 compute-0 nova_compute[186329]: 2025-12-05 06:21:44.634 186333 INFO nova.virt.libvirt.driver [-] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Instance spawned successfully.
Dec 05 06:21:44 compute-0 nova_compute[186329]: 2025-12-05 06:21:44.634 186333 DEBUG nova.virt.libvirt.driver [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Dec 05 06:21:45 compute-0 nova_compute[186329]: 2025-12-05 06:21:45.141 186333 DEBUG nova.virt.libvirt.driver [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:21:45 compute-0 nova_compute[186329]: 2025-12-05 06:21:45.141 186333 DEBUG nova.virt.libvirt.driver [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:21:45 compute-0 nova_compute[186329]: 2025-12-05 06:21:45.141 186333 DEBUG nova.virt.libvirt.driver [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:21:45 compute-0 nova_compute[186329]: 2025-12-05 06:21:45.142 186333 DEBUG nova.virt.libvirt.driver [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:21:45 compute-0 nova_compute[186329]: 2025-12-05 06:21:45.142 186333 DEBUG nova.virt.libvirt.driver [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:21:45 compute-0 nova_compute[186329]: 2025-12-05 06:21:45.142 186333 DEBUG nova.virt.libvirt.driver [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:21:45 compute-0 nova_compute[186329]: 2025-12-05 06:21:45.645 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:21:45 compute-0 nova_compute[186329]: 2025-12-05 06:21:45.649 186333 INFO nova.compute.manager [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Took 9.51 seconds to spawn the instance on the hypervisor.
Dec 05 06:21:45 compute-0 nova_compute[186329]: 2025-12-05 06:21:45.649 186333 DEBUG nova.compute.manager [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 05 06:21:46 compute-0 nova_compute[186329]: 2025-12-05 06:21:46.170 186333 INFO nova.compute.manager [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Took 14.67 seconds to build instance.
Dec 05 06:21:46 compute-0 nova_compute[186329]: 2025-12-05 06:21:46.672 186333 DEBUG nova.compute.manager [req-4b0e2798-e373-4417-a73e-ba80f287fc31 req-d002906d-2867-4968-9ed5-913a4df0c89c fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Received event network-vif-plugged-e4b4ab1b-b70e-45e2-b062-e5da4d3c3385 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:21:46 compute-0 nova_compute[186329]: 2025-12-05 06:21:46.672 186333 DEBUG oslo_concurrency.lockutils [req-4b0e2798-e373-4417-a73e-ba80f287fc31 req-d002906d-2867-4968-9ed5-913a4df0c89c fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "ee8648e9-dcf8-46a3-a89f-3d3dedafbe21-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:21:46 compute-0 nova_compute[186329]: 2025-12-05 06:21:46.672 186333 DEBUG oslo_concurrency.lockutils [req-4b0e2798-e373-4417-a73e-ba80f287fc31 req-d002906d-2867-4968-9ed5-913a4df0c89c fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "ee8648e9-dcf8-46a3-a89f-3d3dedafbe21-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:21:46 compute-0 nova_compute[186329]: 2025-12-05 06:21:46.672 186333 DEBUG oslo_concurrency.lockutils [req-4b0e2798-e373-4417-a73e-ba80f287fc31 req-d002906d-2867-4968-9ed5-913a4df0c89c fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "ee8648e9-dcf8-46a3-a89f-3d3dedafbe21-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:21:46 compute-0 nova_compute[186329]: 2025-12-05 06:21:46.672 186333 DEBUG nova.compute.manager [req-4b0e2798-e373-4417-a73e-ba80f287fc31 req-d002906d-2867-4968-9ed5-913a4df0c89c fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] No waiting events found dispatching network-vif-plugged-e4b4ab1b-b70e-45e2-b062-e5da4d3c3385 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:21:46 compute-0 nova_compute[186329]: 2025-12-05 06:21:46.673 186333 WARNING nova.compute.manager [req-4b0e2798-e373-4417-a73e-ba80f287fc31 req-d002906d-2867-4968-9ed5-913a4df0c89c fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Received unexpected event network-vif-plugged-e4b4ab1b-b70e-45e2-b062-e5da4d3c3385 for instance with vm_state active and task_state None.
Dec 05 06:21:46 compute-0 nova_compute[186329]: 2025-12-05 06:21:46.673 186333 DEBUG oslo_concurrency.lockutils [None req-e778a7f7-5e7c-4382-b961-5dcfedd3027e 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lock "ee8648e9-dcf8-46a3-a89f-3d3dedafbe21" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.182s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:21:47 compute-0 nova_compute[186329]: 2025-12-05 06:21:47.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:21:48 compute-0 nova_compute[186329]: 2025-12-05 06:21:48.792 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:21:50 compute-0 nova_compute[186329]: 2025-12-05 06:21:50.649 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:21:50 compute-0 nova_compute[186329]: 2025-12-05 06:21:50.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:21:51 compute-0 nova_compute[186329]: 2025-12-05 06:21:51.219 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:21:51 compute-0 nova_compute[186329]: 2025-12-05 06:21:51.219 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:21:51 compute-0 nova_compute[186329]: 2025-12-05 06:21:51.219 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:21:51 compute-0 nova_compute[186329]: 2025-12-05 06:21:51.220 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 05 06:21:52 compute-0 nova_compute[186329]: 2025-12-05 06:21:52.246 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee8648e9-dcf8-46a3-a89f-3d3dedafbe21/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:21:52 compute-0 nova_compute[186329]: 2025-12-05 06:21:52.303 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee8648e9-dcf8-46a3-a89f-3d3dedafbe21/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:21:52 compute-0 nova_compute[186329]: 2025-12-05 06:21:52.303 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee8648e9-dcf8-46a3-a89f-3d3dedafbe21/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:21:52 compute-0 nova_compute[186329]: 2025-12-05 06:21:52.357 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ee8648e9-dcf8-46a3-a89f-3d3dedafbe21/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:21:52 compute-0 podman[210886]: 2025-12-05 06:21:52.493788295 +0000 UTC m=+0.078856898 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 05 06:21:52 compute-0 podman[210885]: 2025-12-05 06:21:52.533669711 +0000 UTC m=+0.117174103 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 05 06:21:52 compute-0 nova_compute[186329]: 2025-12-05 06:21:52.603 186333 WARNING nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:21:52 compute-0 nova_compute[186329]: 2025-12-05 06:21:52.604 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:21:52 compute-0 nova_compute[186329]: 2025-12-05 06:21:52.619 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.014s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:21:52 compute-0 nova_compute[186329]: 2025-12-05 06:21:52.619 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5732MB free_disk=73.16639709472656GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": 
"0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 05 06:21:52 compute-0 nova_compute[186329]: 2025-12-05 06:21:52.620 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:21:52 compute-0 nova_compute[186329]: 2025-12-05 06:21:52.620 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:21:53 compute-0 nova_compute[186329]: 2025-12-05 06:21:53.792 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:21:54 compute-0 nova_compute[186329]: 2025-12-05 06:21:54.162 186333 INFO nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance 83dcc9e9-9fcf-4e34-8b76-598c38ed9bc0 has allocations against this compute host but is not found in the database.
Dec 05 06:21:54 compute-0 nova_compute[186329]: 2025-12-05 06:21:54.163 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance ee8648e9-dcf8-46a3-a89f-3d3dedafbe21 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 05 06:21:54 compute-0 nova_compute[186329]: 2025-12-05 06:21:54.163 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 05 06:21:54 compute-0 nova_compute[186329]: 2025-12-05 06:21:54.164 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 06:21:52 up 59 min,  0 user,  load average: 0.20, 0.20, 0.28\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_bc5d63a38e00424aa78cb06b6b41bc09': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 05 06:21:54 compute-0 nova_compute[186329]: 2025-12-05 06:21:54.200 186333 DEBUG nova.compute.provider_tree [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:21:54 compute-0 nova_compute[186329]: 2025-12-05 06:21:54.704 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:21:55 compute-0 nova_compute[186329]: 2025-12-05 06:21:55.211 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 05 06:21:55 compute-0 nova_compute[186329]: 2025-12-05 06:21:55.211 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.591s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:21:55 compute-0 ovn_controller[95223]: 2025-12-05T06:21:55Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:55:b2:c5 10.100.0.5
Dec 05 06:21:55 compute-0 ovn_controller[95223]: 2025-12-05T06:21:55Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:55:b2:c5 10.100.0.5
Dec 05 06:21:55 compute-0 nova_compute[186329]: 2025-12-05 06:21:55.651 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:21:56 compute-0 nova_compute[186329]: 2025-12-05 06:21:56.206 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:21:56 compute-0 nova_compute[186329]: 2025-12-05 06:21:56.207 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:21:56 compute-0 nova_compute[186329]: 2025-12-05 06:21:56.207 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:21:56 compute-0 nova_compute[186329]: 2025-12-05 06:21:56.207 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:21:56 compute-0 nova_compute[186329]: 2025-12-05 06:21:56.208 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:21:56 compute-0 nova_compute[186329]: 2025-12-05 06:21:56.208 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 05 06:21:56 compute-0 nova_compute[186329]: 2025-12-05 06:21:56.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:21:57 compute-0 nova_compute[186329]: 2025-12-05 06:21:57.549 186333 DEBUG nova.virt.libvirt.driver [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 6f606379-b808-4558-9b6b-9d7501c6f2c5] Creating tmpfile /var/lib/nova/instances/tmpaczowz2c to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Dec 05 06:21:57 compute-0 nova_compute[186329]: 2025-12-05 06:21:57.550 186333 WARNING neutronclient.v2_0.client [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:21:57 compute-0 nova_compute[186329]: 2025-12-05 06:21:57.552 186333 DEBUG nova.compute.manager [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpaczowz2c',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Dec 05 06:21:58 compute-0 nova_compute[186329]: 2025-12-05 06:21:58.794 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:21:59 compute-0 nova_compute[186329]: 2025-12-05 06:21:59.572 186333 WARNING neutronclient.v2_0.client [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:21:59 compute-0 podman[196599]: time="2025-12-05T06:21:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:21:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:21:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18590 "" "Go-http-client/1.1"
Dec 05 06:21:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:21:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3048 "" "Go-http-client/1.1"
Dec 05 06:22:00 compute-0 nova_compute[186329]: 2025-12-05 06:22:00.654 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:01 compute-0 openstack_network_exporter[198686]: ERROR   06:22:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:22:01 compute-0 openstack_network_exporter[198686]: ERROR   06:22:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:22:01 compute-0 openstack_network_exporter[198686]: ERROR   06:22:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:22:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:22:01 compute-0 openstack_network_exporter[198686]: ERROR   06:22:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:22:01 compute-0 openstack_network_exporter[198686]: ERROR   06:22:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:22:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:22:02 compute-0 podman[210941]: 2025-12-05 06:22:02.457537337 +0000 UTC m=+0.042935237 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Dec 05 06:22:02 compute-0 podman[210942]: 2025-12-05 06:22:02.457536154 +0000 UTC m=+0.040548200 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, release=1755695350, distribution-scope=public, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc.)
Dec 05 06:22:02 compute-0 podman[210943]: 2025-12-05 06:22:02.4994015 +0000 UTC m=+0.080870935 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec 05 06:22:03 compute-0 nova_compute[186329]: 2025-12-05 06:22:03.401 186333 DEBUG nova.compute.manager [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpaczowz2c',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6f606379-b808-4558-9b6b-9d7501c6f2c5',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Dec 05 06:22:03 compute-0 nova_compute[186329]: 2025-12-05 06:22:03.796 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:04 compute-0 nova_compute[186329]: 2025-12-05 06:22:04.410 186333 DEBUG oslo_concurrency.lockutils [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "refresh_cache-6f606379-b808-4558-9b6b-9d7501c6f2c5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:22:04 compute-0 nova_compute[186329]: 2025-12-05 06:22:04.410 186333 DEBUG oslo_concurrency.lockutils [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquired lock "refresh_cache-6f606379-b808-4558-9b6b-9d7501c6f2c5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:22:04 compute-0 nova_compute[186329]: 2025-12-05 06:22:04.411 186333 DEBUG nova.network.neutron [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 6f606379-b808-4558-9b6b-9d7501c6f2c5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 05 06:22:04 compute-0 nova_compute[186329]: 2025-12-05 06:22:04.915 186333 WARNING neutronclient.v2_0.client [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:22:05 compute-0 nova_compute[186329]: 2025-12-05 06:22:05.657 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:05 compute-0 nova_compute[186329]: 2025-12-05 06:22:05.758 186333 WARNING neutronclient.v2_0.client [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:22:05 compute-0 nova_compute[186329]: 2025-12-05 06:22:05.904 186333 DEBUG nova.network.neutron [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 6f606379-b808-4558-9b6b-9d7501c6f2c5] Updating instance_info_cache with network_info: [{"id": "cd470aa0-c3d9-4008-b067-57c65c14fb38", "address": "fa:16:3e:8c:fa:b2", "network": {"id": "8c6607ab-315b-4ce0-bb4d-e22d0d588c81", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1945444158-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa7b2a65c9a54b598b902ce6fa21d41e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd470aa0-c3", "ovs_interfaceid": "cd470aa0-c3d9-4008-b067-57c65c14fb38", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:22:06 compute-0 nova_compute[186329]: 2025-12-05 06:22:06.409 186333 DEBUG oslo_concurrency.lockutils [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Releasing lock "refresh_cache-6f606379-b808-4558-9b6b-9d7501c6f2c5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:22:06 compute-0 nova_compute[186329]: 2025-12-05 06:22:06.417 186333 DEBUG nova.virt.libvirt.driver [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 6f606379-b808-4558-9b6b-9d7501c6f2c5] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpaczowz2c',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6f606379-b808-4558-9b6b-9d7501c6f2c5',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Dec 05 06:22:06 compute-0 nova_compute[186329]: 2025-12-05 06:22:06.417 186333 DEBUG nova.virt.libvirt.driver [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 6f606379-b808-4558-9b6b-9d7501c6f2c5] Creating instance directory: /var/lib/nova/instances/6f606379-b808-4558-9b6b-9d7501c6f2c5 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Dec 05 06:22:06 compute-0 nova_compute[186329]: 2025-12-05 06:22:06.417 186333 DEBUG nova.virt.libvirt.driver [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 6f606379-b808-4558-9b6b-9d7501c6f2c5] Creating disk.info with the contents: {'/var/lib/nova/instances/6f606379-b808-4558-9b6b-9d7501c6f2c5/disk': 'qcow2', '/var/lib/nova/instances/6f606379-b808-4558-9b6b-9d7501c6f2c5/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Dec 05 06:22:06 compute-0 nova_compute[186329]: 2025-12-05 06:22:06.418 186333 DEBUG nova.virt.libvirt.driver [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 6f606379-b808-4558-9b6b-9d7501c6f2c5] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Dec 05 06:22:06 compute-0 nova_compute[186329]: 2025-12-05 06:22:06.418 186333 DEBUG nova.objects.instance [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 6f606379-b808-4558-9b6b-9d7501c6f2c5 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:22:06 compute-0 nova_compute[186329]: 2025-12-05 06:22:06.922 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:22:06 compute-0 nova_compute[186329]: 2025-12-05 06:22:06.924 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:22:06 compute-0 nova_compute[186329]: 2025-12-05 06:22:06.926 186333 DEBUG oslo_concurrency.processutils [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:22:06 compute-0 nova_compute[186329]: 2025-12-05 06:22:06.968 186333 DEBUG oslo_concurrency.processutils [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:22:06 compute-0 nova_compute[186329]: 2025-12-05 06:22:06.969 186333 DEBUG oslo_concurrency.lockutils [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:22:06 compute-0 nova_compute[186329]: 2025-12-05 06:22:06.969 186333 DEBUG oslo_concurrency.lockutils [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:22:06 compute-0 nova_compute[186329]: 2025-12-05 06:22:06.970 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:22:06 compute-0 nova_compute[186329]: 2025-12-05 06:22:06.972 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:22:06 compute-0 nova_compute[186329]: 2025-12-05 06:22:06.972 186333 DEBUG oslo_concurrency.processutils [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:22:07 compute-0 nova_compute[186329]: 2025-12-05 06:22:07.013 186333 DEBUG oslo_concurrency.processutils [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:22:07 compute-0 nova_compute[186329]: 2025-12-05 06:22:07.014 186333 DEBUG oslo_concurrency.processutils [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b,backing_fmt=raw /var/lib/nova/instances/6f606379-b808-4558-9b6b-9d7501c6f2c5/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:22:07 compute-0 nova_compute[186329]: 2025-12-05 06:22:07.033 186333 DEBUG oslo_concurrency.processutils [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b,backing_fmt=raw /var/lib/nova/instances/6f606379-b808-4558-9b6b-9d7501c6f2c5/disk 1073741824" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:22:07 compute-0 nova_compute[186329]: 2025-12-05 06:22:07.033 186333 DEBUG oslo_concurrency.lockutils [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.064s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:22:07 compute-0 nova_compute[186329]: 2025-12-05 06:22:07.034 186333 DEBUG oslo_concurrency.processutils [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:22:07 compute-0 nova_compute[186329]: 2025-12-05 06:22:07.073 186333 DEBUG oslo_concurrency.processutils [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:22:07 compute-0 nova_compute[186329]: 2025-12-05 06:22:07.074 186333 DEBUG nova.virt.disk.api [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Checking if we can resize image /var/lib/nova/instances/6f606379-b808-4558-9b6b-9d7501c6f2c5/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 05 06:22:07 compute-0 nova_compute[186329]: 2025-12-05 06:22:07.074 186333 DEBUG oslo_concurrency.processutils [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6f606379-b808-4558-9b6b-9d7501c6f2c5/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:22:07 compute-0 nova_compute[186329]: 2025-12-05 06:22:07.117 186333 DEBUG oslo_concurrency.processutils [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6f606379-b808-4558-9b6b-9d7501c6f2c5/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:22:07 compute-0 nova_compute[186329]: 2025-12-05 06:22:07.118 186333 DEBUG nova.virt.disk.api [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Cannot resize image /var/lib/nova/instances/6f606379-b808-4558-9b6b-9d7501c6f2c5/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 05 06:22:07 compute-0 nova_compute[186329]: 2025-12-05 06:22:07.118 186333 DEBUG nova.objects.instance [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lazy-loading 'migration_context' on Instance uuid 6f606379-b808-4558-9b6b-9d7501c6f2c5 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:22:07 compute-0 nova_compute[186329]: 2025-12-05 06:22:07.622 186333 DEBUG nova.objects.base [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Object Instance<6f606379-b808-4558-9b6b-9d7501c6f2c5> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Dec 05 06:22:07 compute-0 nova_compute[186329]: 2025-12-05 06:22:07.623 186333 DEBUG oslo_concurrency.processutils [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/6f606379-b808-4558-9b6b-9d7501c6f2c5/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:22:07 compute-0 nova_compute[186329]: 2025-12-05 06:22:07.641 186333 DEBUG oslo_concurrency.processutils [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/6f606379-b808-4558-9b6b-9d7501c6f2c5/disk.config 497664" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:22:07 compute-0 nova_compute[186329]: 2025-12-05 06:22:07.642 186333 DEBUG nova.virt.libvirt.driver [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 6f606379-b808-4558-9b6b-9d7501c6f2c5] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Dec 05 06:22:07 compute-0 nova_compute[186329]: 2025-12-05 06:22:07.643 186333 DEBUG nova.virt.libvirt.vif [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-05T06:21:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-641343724',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-641',id=12,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T06:21:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='bc5d63a38e00424aa78cb06b6b41bc09',ramdisk_id='',reservation_id='r-sn2y1vt0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-409405411',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-409405411-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T06:21:26Z,user_data=None,user_id='1b1d9849dd3f4328991385825f24dc8f',uuid=6f606379-b808-4558-9b6b-9d7501c6f2c5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cd470aa0-c3d9-4008-b067-57c65c14fb38", "address": "fa:16:3e:8c:fa:b2", "network": {"id": "8c6607ab-315b-4ce0-bb4d-e22d0d588c81", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1945444158-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa7b2a65c9a54b598b902ce6fa21d41e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapcd470aa0-c3", "ovs_interfaceid": "cd470aa0-c3d9-4008-b067-57c65c14fb38", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 05 06:22:07 compute-0 nova_compute[186329]: 2025-12-05 06:22:07.643 186333 DEBUG nova.network.os_vif_util [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Converting VIF {"id": "cd470aa0-c3d9-4008-b067-57c65c14fb38", "address": "fa:16:3e:8c:fa:b2", "network": {"id": "8c6607ab-315b-4ce0-bb4d-e22d0d588c81", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1945444158-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa7b2a65c9a54b598b902ce6fa21d41e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapcd470aa0-c3", "ovs_interfaceid": "cd470aa0-c3d9-4008-b067-57c65c14fb38", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:22:07 compute-0 nova_compute[186329]: 2025-12-05 06:22:07.644 186333 DEBUG nova.network.os_vif_util [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8c:fa:b2,bridge_name='br-int',has_traffic_filtering=True,id=cd470aa0-c3d9-4008-b067-57c65c14fb38,network=Network(8c6607ab-315b-4ce0-bb4d-e22d0d588c81),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd470aa0-c3') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:22:07 compute-0 nova_compute[186329]: 2025-12-05 06:22:07.644 186333 DEBUG os_vif [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:fa:b2,bridge_name='br-int',has_traffic_filtering=True,id=cd470aa0-c3d9-4008-b067-57c65c14fb38,network=Network(8c6607ab-315b-4ce0-bb4d-e22d0d588c81),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd470aa0-c3') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 05 06:22:07 compute-0 nova_compute[186329]: 2025-12-05 06:22:07.644 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:07 compute-0 nova_compute[186329]: 2025-12-05 06:22:07.645 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:22:07 compute-0 nova_compute[186329]: 2025-12-05 06:22:07.645 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:22:07 compute-0 nova_compute[186329]: 2025-12-05 06:22:07.646 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:07 compute-0 nova_compute[186329]: 2025-12-05 06:22:07.647 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '8eba999f-0075-506e-b7f5-c934ae183ac9', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:22:07 compute-0 nova_compute[186329]: 2025-12-05 06:22:07.647 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:07 compute-0 nova_compute[186329]: 2025-12-05 06:22:07.648 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:07 compute-0 nova_compute[186329]: 2025-12-05 06:22:07.650 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:07 compute-0 nova_compute[186329]: 2025-12-05 06:22:07.650 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcd470aa0-c3, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:22:07 compute-0 nova_compute[186329]: 2025-12-05 06:22:07.651 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapcd470aa0-c3, col_values=(('qos', UUID('527b7b3f-4f55-444a-b2bf-e6a3375bc816')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:22:07 compute-0 nova_compute[186329]: 2025-12-05 06:22:07.651 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapcd470aa0-c3, col_values=(('external_ids', {'iface-id': 'cd470aa0-c3d9-4008-b067-57c65c14fb38', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8c:fa:b2', 'vm-uuid': '6f606379-b808-4558-9b6b-9d7501c6f2c5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:22:07 compute-0 nova_compute[186329]: 2025-12-05 06:22:07.652 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:07 compute-0 NetworkManager[55434]: <info>  [1764915727.6526] manager: (tapcd470aa0-c3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/55)
Dec 05 06:22:07 compute-0 nova_compute[186329]: 2025-12-05 06:22:07.654 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:22:07 compute-0 nova_compute[186329]: 2025-12-05 06:22:07.656 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:07 compute-0 nova_compute[186329]: 2025-12-05 06:22:07.657 186333 INFO os_vif [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:fa:b2,bridge_name='br-int',has_traffic_filtering=True,id=cd470aa0-c3d9-4008-b067-57c65c14fb38,network=Network(8c6607ab-315b-4ce0-bb4d-e22d0d588c81),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd470aa0-c3')
Dec 05 06:22:07 compute-0 nova_compute[186329]: 2025-12-05 06:22:07.657 186333 DEBUG nova.virt.libvirt.driver [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Dec 05 06:22:07 compute-0 nova_compute[186329]: 2025-12-05 06:22:07.658 186333 DEBUG nova.compute.manager [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpaczowz2c',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6f606379-b808-4558-9b6b-9d7501c6f2c5',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Dec 05 06:22:07 compute-0 nova_compute[186329]: 2025-12-05 06:22:07.658 186333 WARNING neutronclient.v2_0.client [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:22:08 compute-0 nova_compute[186329]: 2025-12-05 06:22:08.693 186333 WARNING neutronclient.v2_0.client [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:22:08 compute-0 nova_compute[186329]: 2025-12-05 06:22:08.797 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:10 compute-0 nova_compute[186329]: 2025-12-05 06:22:10.750 186333 DEBUG nova.network.neutron [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 6f606379-b808-4558-9b6b-9d7501c6f2c5] Port cd470aa0-c3d9-4008-b067-57c65c14fb38 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Dec 05 06:22:10 compute-0 nova_compute[186329]: 2025-12-05 06:22:10.757 186333 DEBUG nova.compute.manager [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpaczowz2c',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6f606379-b808-4558-9b6b-9d7501c6f2c5',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Dec 05 06:22:12 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec 05 06:22:12 compute-0 nova_compute[186329]: 2025-12-05 06:22:12.652 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:13 compute-0 systemd[1]: Starting libvirt proxy daemon...
Dec 05 06:22:13 compute-0 systemd[1]: Started libvirt proxy daemon.
Dec 05 06:22:13 compute-0 kernel: tapcd470aa0-c3: entered promiscuous mode
Dec 05 06:22:13 compute-0 NetworkManager[55434]: <info>  [1764915733.4694] manager: (tapcd470aa0-c3): new Tun device (/org/freedesktop/NetworkManager/Devices/56)
Dec 05 06:22:13 compute-0 ovn_controller[95223]: 2025-12-05T06:22:13Z|00117|binding|INFO|Claiming lport cd470aa0-c3d9-4008-b067-57c65c14fb38 for this additional chassis.
Dec 05 06:22:13 compute-0 nova_compute[186329]: 2025-12-05 06:22:13.471 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:13 compute-0 ovn_controller[95223]: 2025-12-05T06:22:13Z|00118|binding|INFO|cd470aa0-c3d9-4008-b067-57c65c14fb38: Claiming fa:16:3e:8c:fa:b2 10.100.0.11
Dec 05 06:22:13 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:13.483 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8c:fa:b2 10.100.0.11'], port_security=['fa:16:3e:8c:fa:b2 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '6f606379-b808-4558-9b6b-9d7501c6f2c5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8c6607ab-315b-4ce0-bb4d-e22d0d588c81', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bc5d63a38e00424aa78cb06b6b41bc09', 'neutron:revision_number': '10', 'neutron:security_group_ids': '7b69bd8c-f5f0-4d39-bcf7-4157a4a0e235', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d520596c-b299-4696-9930-0280ad4bb232, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=cd470aa0-c3d9-4008-b067-57c65c14fb38) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:22:13 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:13.484 104041 INFO neutron.agent.ovn.metadata.agent [-] Port cd470aa0-c3d9-4008-b067-57c65c14fb38 in datapath 8c6607ab-315b-4ce0-bb4d-e22d0d588c81 unbound from our chassis
Dec 05 06:22:13 compute-0 nova_compute[186329]: 2025-12-05 06:22:13.486 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:13 compute-0 nova_compute[186329]: 2025-12-05 06:22:13.488 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:13 compute-0 ovn_controller[95223]: 2025-12-05T06:22:13Z|00119|binding|INFO|Setting lport cd470aa0-c3d9-4008-b067-57c65c14fb38 ovn-installed in OVS
Dec 05 06:22:13 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:13.485 104041 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8c6607ab-315b-4ce0-bb4d-e22d0d588c81
Dec 05 06:22:13 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:13.496 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[18e0afec-8336-463d-915e-12e08391ccf2]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:22:13 compute-0 systemd-udevd[211050]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 06:22:13 compute-0 systemd-machined[152967]: New machine qemu-10-instance-0000000c.
Dec 05 06:22:13 compute-0 systemd[1]: Started Virtual Machine qemu-10-instance-0000000c.
Dec 05 06:22:13 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:13.517 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[8ca017b2-482c-4903-bf95-e1b0ed2714f5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:22:13 compute-0 NetworkManager[55434]: <info>  [1764915733.5197] device (tapcd470aa0-c3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 06:22:13 compute-0 NetworkManager[55434]: <info>  [1764915733.5203] device (tapcd470aa0-c3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 06:22:13 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:13.522 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[f569a497-c136-4949-a3a4-72900944fb89]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:22:13 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:13.543 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[8853d44c-2461-4860-803a-6f9806c038e2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:22:13 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:13.556 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[64f9464f-d4b8-4ca8-9777-e126df142247]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8c6607ab-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1d:77:7c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 358238, 'reachable_time': 18287, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211060, 'error': None, 'target': 'ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:22:13 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:13.568 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[15c1bb47-e13d-49e4-908a-e39ac0a2df8d]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap8c6607ab-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 358245, 'tstamp': 358245}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211062, 'error': None, 'target': 'ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap8c6607ab-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 358247, 'tstamp': 358247}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211062, 'error': None, 'target': 'ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:22:13 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:13.570 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8c6607ab-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:22:13 compute-0 nova_compute[186329]: 2025-12-05 06:22:13.571 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:13 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:13.572 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8c6607ab-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:22:13 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:13.572 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:22:13 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:13.573 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8c6607ab-30, col_values=(('external_ids', {'iface-id': '23184677-e308-4f95-b4f6-6e02e8b7fc45'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:22:13 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:13.573 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:22:13 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:13.574 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[fd73170f-c7b3-49bc-bf02-c415cd74101e]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-8c6607ab-315b-4ce0-bb4d-e22d0d588c81\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/8c6607ab-315b-4ce0-bb4d-e22d0d588c81.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 8c6607ab-315b-4ce0-bb4d-e22d0d588c81\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:22:13 compute-0 nova_compute[186329]: 2025-12-05 06:22:13.799 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:16 compute-0 ovn_controller[95223]: 2025-12-05T06:22:16Z|00120|binding|INFO|Claiming lport cd470aa0-c3d9-4008-b067-57c65c14fb38 for this chassis.
Dec 05 06:22:16 compute-0 ovn_controller[95223]: 2025-12-05T06:22:16Z|00121|binding|INFO|cd470aa0-c3d9-4008-b067-57c65c14fb38: Claiming fa:16:3e:8c:fa:b2 10.100.0.11
Dec 05 06:22:16 compute-0 ovn_controller[95223]: 2025-12-05T06:22:16Z|00122|binding|INFO|Setting lport cd470aa0-c3d9-4008-b067-57c65c14fb38 up in Southbound
Dec 05 06:22:17 compute-0 nova_compute[186329]: 2025-12-05 06:22:17.654 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:18 compute-0 nova_compute[186329]: 2025-12-05 06:22:18.630 186333 INFO nova.compute.manager [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 6f606379-b808-4558-9b6b-9d7501c6f2c5] Post operation of migration started
Dec 05 06:22:18 compute-0 nova_compute[186329]: 2025-12-05 06:22:18.631 186333 WARNING neutronclient.v2_0.client [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:22:18 compute-0 nova_compute[186329]: 2025-12-05 06:22:18.800 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:19 compute-0 nova_compute[186329]: 2025-12-05 06:22:19.548 186333 WARNING neutronclient.v2_0.client [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:22:19 compute-0 nova_compute[186329]: 2025-12-05 06:22:19.549 186333 WARNING neutronclient.v2_0.client [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:22:19 compute-0 nova_compute[186329]: 2025-12-05 06:22:19.651 186333 DEBUG oslo_concurrency.lockutils [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "refresh_cache-6f606379-b808-4558-9b6b-9d7501c6f2c5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:22:19 compute-0 nova_compute[186329]: 2025-12-05 06:22:19.651 186333 DEBUG oslo_concurrency.lockutils [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquired lock "refresh_cache-6f606379-b808-4558-9b6b-9d7501c6f2c5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:22:19 compute-0 nova_compute[186329]: 2025-12-05 06:22:19.651 186333 DEBUG nova.network.neutron [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 6f606379-b808-4558-9b6b-9d7501c6f2c5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 05 06:22:20 compute-0 nova_compute[186329]: 2025-12-05 06:22:20.155 186333 WARNING neutronclient.v2_0.client [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:22:20 compute-0 nova_compute[186329]: 2025-12-05 06:22:20.785 186333 WARNING neutronclient.v2_0.client [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:22:20 compute-0 nova_compute[186329]: 2025-12-05 06:22:20.899 186333 DEBUG nova.network.neutron [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 6f606379-b808-4558-9b6b-9d7501c6f2c5] Updating instance_info_cache with network_info: [{"id": "cd470aa0-c3d9-4008-b067-57c65c14fb38", "address": "fa:16:3e:8c:fa:b2", "network": {"id": "8c6607ab-315b-4ce0-bb4d-e22d0d588c81", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1945444158-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa7b2a65c9a54b598b902ce6fa21d41e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd470aa0-c3", "ovs_interfaceid": "cd470aa0-c3d9-4008-b067-57c65c14fb38", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:22:21 compute-0 nova_compute[186329]: 2025-12-05 06:22:21.404 186333 DEBUG oslo_concurrency.lockutils [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Releasing lock "refresh_cache-6f606379-b808-4558-9b6b-9d7501c6f2c5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:22:21 compute-0 nova_compute[186329]: 2025-12-05 06:22:21.918 186333 DEBUG oslo_concurrency.lockutils [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:22:21 compute-0 nova_compute[186329]: 2025-12-05 06:22:21.918 186333 DEBUG oslo_concurrency.lockutils [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:22:21 compute-0 nova_compute[186329]: 2025-12-05 06:22:21.918 186333 DEBUG oslo_concurrency.lockutils [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:22:21 compute-0 nova_compute[186329]: 2025-12-05 06:22:21.921 186333 INFO nova.virt.libvirt.driver [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 6f606379-b808-4558-9b6b-9d7501c6f2c5] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Dec 05 06:22:21 compute-0 virtqemud[186605]: Domain id=10 name='instance-0000000c' uuid=6f606379-b808-4558-9b6b-9d7501c6f2c5 is tainted: custom-monitor
Dec 05 06:22:22 compute-0 nova_compute[186329]: 2025-12-05 06:22:22.656 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:22 compute-0 nova_compute[186329]: 2025-12-05 06:22:22.926 186333 INFO nova.virt.libvirt.driver [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 6f606379-b808-4558-9b6b-9d7501c6f2c5] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Dec 05 06:22:23 compute-0 podman[211083]: 2025-12-05 06:22:23.462375185 +0000 UTC m=+0.045989209 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 06:22:23 compute-0 podman[211082]: 2025-12-05 06:22:23.479642707 +0000 UTC m=+0.063346741 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller)
Dec 05 06:22:23 compute-0 nova_compute[186329]: 2025-12-05 06:22:23.802 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:23 compute-0 nova_compute[186329]: 2025-12-05 06:22:23.930 186333 INFO nova.virt.libvirt.driver [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 6f606379-b808-4558-9b6b-9d7501c6f2c5] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Dec 05 06:22:23 compute-0 nova_compute[186329]: 2025-12-05 06:22:23.933 186333 DEBUG nova.compute.manager [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 6f606379-b808-4558-9b6b-9d7501c6f2c5] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 05 06:22:24 compute-0 nova_compute[186329]: 2025-12-05 06:22:24.441 186333 DEBUG nova.objects.instance [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 6f606379-b808-4558-9b6b-9d7501c6f2c5] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Dec 05 06:22:25 compute-0 nova_compute[186329]: 2025-12-05 06:22:25.455 186333 WARNING neutronclient.v2_0.client [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:22:25 compute-0 nova_compute[186329]: 2025-12-05 06:22:25.538 186333 WARNING neutronclient.v2_0.client [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:22:25 compute-0 nova_compute[186329]: 2025-12-05 06:22:25.539 186333 WARNING neutronclient.v2_0.client [None req-e7ff9e29-cc3e-499c-a031-760bf1318c6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:22:27 compute-0 nova_compute[186329]: 2025-12-05 06:22:27.659 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:28 compute-0 nova_compute[186329]: 2025-12-05 06:22:28.803 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:29.505 104041 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:22:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:29.506 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:22:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:29.507 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:22:29 compute-0 podman[196599]: time="2025-12-05T06:22:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:22:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:22:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18590 "" "Go-http-client/1.1"
Dec 05 06:22:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:22:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3048 "" "Go-http-client/1.1"
Dec 05 06:22:31 compute-0 openstack_network_exporter[198686]: ERROR   06:22:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:22:31 compute-0 openstack_network_exporter[198686]: ERROR   06:22:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:22:31 compute-0 openstack_network_exporter[198686]: ERROR   06:22:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:22:31 compute-0 openstack_network_exporter[198686]: ERROR   06:22:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:22:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:22:31 compute-0 openstack_network_exporter[198686]: ERROR   06:22:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:22:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:22:32 compute-0 nova_compute[186329]: 2025-12-05 06:22:32.660 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:33 compute-0 podman[211128]: 2025-12-05 06:22:33.463800656 +0000 UTC m=+0.043918285 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 05 06:22:33 compute-0 podman[211130]: 2025-12-05 06:22:33.479756363 +0000 UTC m=+0.055946178 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 05 06:22:33 compute-0 podman[211129]: 2025-12-05 06:22:33.505531018 +0000 UTC m=+0.082899789 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, managed_by=edpm_ansible, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, version=9.6, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc.)
Dec 05 06:22:33 compute-0 nova_compute[186329]: 2025-12-05 06:22:33.806 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:37 compute-0 nova_compute[186329]: 2025-12-05 06:22:37.662 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:38 compute-0 nova_compute[186329]: 2025-12-05 06:22:38.071 186333 DEBUG oslo_concurrency.lockutils [None req-d39b406c-6bf7-4b46-a1fd-40a92a602d59 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Acquiring lock "ee8648e9-dcf8-46a3-a89f-3d3dedafbe21" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:22:38 compute-0 nova_compute[186329]: 2025-12-05 06:22:38.071 186333 DEBUG oslo_concurrency.lockutils [None req-d39b406c-6bf7-4b46-a1fd-40a92a602d59 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lock "ee8648e9-dcf8-46a3-a89f-3d3dedafbe21" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:22:38 compute-0 nova_compute[186329]: 2025-12-05 06:22:38.072 186333 DEBUG oslo_concurrency.lockutils [None req-d39b406c-6bf7-4b46-a1fd-40a92a602d59 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Acquiring lock "ee8648e9-dcf8-46a3-a89f-3d3dedafbe21-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:22:38 compute-0 nova_compute[186329]: 2025-12-05 06:22:38.072 186333 DEBUG oslo_concurrency.lockutils [None req-d39b406c-6bf7-4b46-a1fd-40a92a602d59 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lock "ee8648e9-dcf8-46a3-a89f-3d3dedafbe21-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:22:38 compute-0 nova_compute[186329]: 2025-12-05 06:22:38.072 186333 DEBUG oslo_concurrency.lockutils [None req-d39b406c-6bf7-4b46-a1fd-40a92a602d59 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lock "ee8648e9-dcf8-46a3-a89f-3d3dedafbe21-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:22:38 compute-0 nova_compute[186329]: 2025-12-05 06:22:38.079 186333 INFO nova.compute.manager [None req-d39b406c-6bf7-4b46-a1fd-40a92a602d59 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Terminating instance
Dec 05 06:22:38 compute-0 nova_compute[186329]: 2025-12-05 06:22:38.588 186333 DEBUG nova.compute.manager [None req-d39b406c-6bf7-4b46-a1fd-40a92a602d59 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 05 06:22:38 compute-0 kernel: tape4b4ab1b-b7 (unregistering): left promiscuous mode
Dec 05 06:22:38 compute-0 NetworkManager[55434]: <info>  [1764915758.6121] device (tape4b4ab1b-b7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 06:22:38 compute-0 nova_compute[186329]: 2025-12-05 06:22:38.616 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:38 compute-0 ovn_controller[95223]: 2025-12-05T06:22:38Z|00123|binding|INFO|Releasing lport e4b4ab1b-b70e-45e2-b062-e5da4d3c3385 from this chassis (sb_readonly=0)
Dec 05 06:22:38 compute-0 ovn_controller[95223]: 2025-12-05T06:22:38Z|00124|binding|INFO|Setting lport e4b4ab1b-b70e-45e2-b062-e5da4d3c3385 down in Southbound
Dec 05 06:22:38 compute-0 ovn_controller[95223]: 2025-12-05T06:22:38Z|00125|binding|INFO|Removing iface tape4b4ab1b-b7 ovn-installed in OVS
Dec 05 06:22:38 compute-0 nova_compute[186329]: 2025-12-05 06:22:38.619 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:38.623 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:55:b2:c5 10.100.0.5'], port_security=['fa:16:3e:55:b2:c5 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'ee8648e9-dcf8-46a3-a89f-3d3dedafbe21', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8c6607ab-315b-4ce0-bb4d-e22d0d588c81', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bc5d63a38e00424aa78cb06b6b41bc09', 'neutron:revision_number': '5', 'neutron:security_group_ids': '7b69bd8c-f5f0-4d39-bcf7-4157a4a0e235', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d520596c-b299-4696-9930-0280ad4bb232, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], logical_port=e4b4ab1b-b70e-45e2-b062-e5da4d3c3385) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:22:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:38.625 104041 INFO neutron.agent.ovn.metadata.agent [-] Port e4b4ab1b-b70e-45e2-b062-e5da4d3c3385 in datapath 8c6607ab-315b-4ce0-bb4d-e22d0d588c81 unbound from our chassis
Dec 05 06:22:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:38.626 104041 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8c6607ab-315b-4ce0-bb4d-e22d0d588c81
Dec 05 06:22:38 compute-0 nova_compute[186329]: 2025-12-05 06:22:38.631 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:38.639 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[9a30d00c-9c76-4693-97da-e98c772f5c54]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:22:38 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Dec 05 06:22:38 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000d.scope: Consumed 11.998s CPU time.
Dec 05 06:22:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:38.656 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[8a65c7ea-6cb5-4abf-8575-2beec9cdb068]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:22:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:38.657 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[f1dabff2-b444-464b-8095-af3f349a8708]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:22:38 compute-0 systemd-machined[152967]: Machine qemu-9-instance-0000000d terminated.
Dec 05 06:22:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:38.674 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[1e199847-b676-4db4-8d77-7bbd1d301128]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:22:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:38.686 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[50f3410a-703f-4f25-b32d-f58856f55e41]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8c6607ab-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1d:77:7c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 8, 'rx_bytes': 1456, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 8, 'rx_bytes': 1456, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 358238, 'reachable_time': 18287, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211190, 'error': None, 'target': 'ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:22:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:38.695 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[e10854d8-1c8d-45d1-854a-7ffa8f7c8f88]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap8c6607ab-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 358245, 'tstamp': 358245}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211191, 'error': None, 'target': 'ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap8c6607ab-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 358247, 'tstamp': 358247}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211191, 'error': None, 'target': 'ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:22:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:38.696 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8c6607ab-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:22:38 compute-0 nova_compute[186329]: 2025-12-05 06:22:38.697 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:38 compute-0 nova_compute[186329]: 2025-12-05 06:22:38.700 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:38.700 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8c6607ab-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:22:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:38.700 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:22:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:38.700 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8c6607ab-30, col_values=(('external_ids', {'iface-id': '23184677-e308-4f95-b4f6-6e02e8b7fc45'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:22:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:38.700 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:22:38 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:38.701 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[7493312f-2fd6-4d66-a5a7-3db1786b96d5]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-8c6607ab-315b-4ce0-bb4d-e22d0d588c81\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/8c6607ab-315b-4ce0-bb4d-e22d0d588c81.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 8c6607ab-315b-4ce0-bb4d-e22d0d588c81\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:22:38 compute-0 nova_compute[186329]: 2025-12-05 06:22:38.743 186333 DEBUG nova.compute.manager [req-c8a31450-05e5-4458-9f32-55d4f3e35a02 req-b52239ad-e4c8-43c7-a622-04424a1cd14c fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Received event network-vif-unplugged-e4b4ab1b-b70e-45e2-b062-e5da4d3c3385 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:22:38 compute-0 nova_compute[186329]: 2025-12-05 06:22:38.744 186333 DEBUG oslo_concurrency.lockutils [req-c8a31450-05e5-4458-9f32-55d4f3e35a02 req-b52239ad-e4c8-43c7-a622-04424a1cd14c fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "ee8648e9-dcf8-46a3-a89f-3d3dedafbe21-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:22:38 compute-0 nova_compute[186329]: 2025-12-05 06:22:38.744 186333 DEBUG oslo_concurrency.lockutils [req-c8a31450-05e5-4458-9f32-55d4f3e35a02 req-b52239ad-e4c8-43c7-a622-04424a1cd14c fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "ee8648e9-dcf8-46a3-a89f-3d3dedafbe21-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:22:38 compute-0 nova_compute[186329]: 2025-12-05 06:22:38.744 186333 DEBUG oslo_concurrency.lockutils [req-c8a31450-05e5-4458-9f32-55d4f3e35a02 req-b52239ad-e4c8-43c7-a622-04424a1cd14c fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "ee8648e9-dcf8-46a3-a89f-3d3dedafbe21-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:22:38 compute-0 nova_compute[186329]: 2025-12-05 06:22:38.744 186333 DEBUG nova.compute.manager [req-c8a31450-05e5-4458-9f32-55d4f3e35a02 req-b52239ad-e4c8-43c7-a622-04424a1cd14c fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] No waiting events found dispatching network-vif-unplugged-e4b4ab1b-b70e-45e2-b062-e5da4d3c3385 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:22:38 compute-0 nova_compute[186329]: 2025-12-05 06:22:38.745 186333 DEBUG nova.compute.manager [req-c8a31450-05e5-4458-9f32-55d4f3e35a02 req-b52239ad-e4c8-43c7-a622-04424a1cd14c fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Received event network-vif-unplugged-e4b4ab1b-b70e-45e2-b062-e5da4d3c3385 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 05 06:22:38 compute-0 nova_compute[186329]: 2025-12-05 06:22:38.807 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:38 compute-0 nova_compute[186329]: 2025-12-05 06:22:38.817 186333 INFO nova.virt.libvirt.driver [-] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Instance destroyed successfully.
Dec 05 06:22:38 compute-0 nova_compute[186329]: 2025-12-05 06:22:38.818 186333 DEBUG nova.objects.instance [None req-d39b406c-6bf7-4b46-a1fd-40a92a602d59 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lazy-loading 'resources' on Instance uuid ee8648e9-dcf8-46a3-a89f-3d3dedafbe21 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:22:39 compute-0 nova_compute[186329]: 2025-12-05 06:22:39.321 186333 DEBUG nova.virt.libvirt.vif [None req-d39b406c-6bf7-4b46-a1fd-40a92a602d59 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-12-05T06:21:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-1959212984',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-195',id=13,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T06:21:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='bc5d63a38e00424aa78cb06b6b41bc09',ramdisk_id='',reservation_id='r-mdfznq5f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mod
el='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-409405411',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-409405411-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T06:21:45Z,user_data=None,user_id='1b1d9849dd3f4328991385825f24dc8f',uuid=ee8648e9-dcf8-46a3-a89f-3d3dedafbe21,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e4b4ab1b-b70e-45e2-b062-e5da4d3c3385", "address": "fa:16:3e:55:b2:c5", "network": {"id": "8c6607ab-315b-4ce0-bb4d-e22d0d588c81", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1945444158-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa7b2a65c9a54b598b902ce6fa21d41e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape4b4ab1b-b7", "ovs_interfaceid": "e4b4ab1b-b70e-45e2-b062-e5da4d3c3385", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 05 06:22:39 compute-0 nova_compute[186329]: 2025-12-05 06:22:39.322 186333 DEBUG nova.network.os_vif_util [None req-d39b406c-6bf7-4b46-a1fd-40a92a602d59 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Converting VIF {"id": "e4b4ab1b-b70e-45e2-b062-e5da4d3c3385", "address": "fa:16:3e:55:b2:c5", "network": {"id": "8c6607ab-315b-4ce0-bb4d-e22d0d588c81", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1945444158-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa7b2a65c9a54b598b902ce6fa21d41e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape4b4ab1b-b7", "ovs_interfaceid": "e4b4ab1b-b70e-45e2-b062-e5da4d3c3385", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:22:39 compute-0 nova_compute[186329]: 2025-12-05 06:22:39.322 186333 DEBUG nova.network.os_vif_util [None req-d39b406c-6bf7-4b46-a1fd-40a92a602d59 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:55:b2:c5,bridge_name='br-int',has_traffic_filtering=True,id=e4b4ab1b-b70e-45e2-b062-e5da4d3c3385,network=Network(8c6607ab-315b-4ce0-bb4d-e22d0d588c81),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape4b4ab1b-b7') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:22:39 compute-0 nova_compute[186329]: 2025-12-05 06:22:39.322 186333 DEBUG os_vif [None req-d39b406c-6bf7-4b46-a1fd-40a92a602d59 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:55:b2:c5,bridge_name='br-int',has_traffic_filtering=True,id=e4b4ab1b-b70e-45e2-b062-e5da4d3c3385,network=Network(8c6607ab-315b-4ce0-bb4d-e22d0d588c81),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape4b4ab1b-b7') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 05 06:22:39 compute-0 nova_compute[186329]: 2025-12-05 06:22:39.324 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:39 compute-0 nova_compute[186329]: 2025-12-05 06:22:39.324 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape4b4ab1b-b7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:22:39 compute-0 nova_compute[186329]: 2025-12-05 06:22:39.325 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:39 compute-0 nova_compute[186329]: 2025-12-05 06:22:39.326 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:39 compute-0 nova_compute[186329]: 2025-12-05 06:22:39.327 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:39 compute-0 nova_compute[186329]: 2025-12-05 06:22:39.327 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=d69436a2-7b4b-4833-90f1-57782ee257d1) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:22:39 compute-0 nova_compute[186329]: 2025-12-05 06:22:39.327 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:39 compute-0 nova_compute[186329]: 2025-12-05 06:22:39.328 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:39 compute-0 nova_compute[186329]: 2025-12-05 06:22:39.330 186333 INFO os_vif [None req-d39b406c-6bf7-4b46-a1fd-40a92a602d59 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:55:b2:c5,bridge_name='br-int',has_traffic_filtering=True,id=e4b4ab1b-b70e-45e2-b062-e5da4d3c3385,network=Network(8c6607ab-315b-4ce0-bb4d-e22d0d588c81),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape4b4ab1b-b7')
Dec 05 06:22:39 compute-0 nova_compute[186329]: 2025-12-05 06:22:39.330 186333 INFO nova.virt.libvirt.driver [None req-d39b406c-6bf7-4b46-a1fd-40a92a602d59 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Deleting instance files /var/lib/nova/instances/ee8648e9-dcf8-46a3-a89f-3d3dedafbe21_del
Dec 05 06:22:39 compute-0 nova_compute[186329]: 2025-12-05 06:22:39.330 186333 INFO nova.virt.libvirt.driver [None req-d39b406c-6bf7-4b46-a1fd-40a92a602d59 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Deletion of /var/lib/nova/instances/ee8648e9-dcf8-46a3-a89f-3d3dedafbe21_del complete
Dec 05 06:22:39 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:39.616 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '46:84:d1', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'a6:99:88:db:d9:a2'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:22:39 compute-0 nova_compute[186329]: 2025-12-05 06:22:39.617 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:39 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:39.617 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 05 06:22:39 compute-0 nova_compute[186329]: 2025-12-05 06:22:39.837 186333 INFO nova.compute.manager [None req-d39b406c-6bf7-4b46-a1fd-40a92a602d59 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Took 1.25 seconds to destroy the instance on the hypervisor.
Dec 05 06:22:39 compute-0 nova_compute[186329]: 2025-12-05 06:22:39.838 186333 DEBUG oslo.service.backend._eventlet.loopingcall [None req-d39b406c-6bf7-4b46-a1fd-40a92a602d59 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 05 06:22:39 compute-0 nova_compute[186329]: 2025-12-05 06:22:39.838 186333 DEBUG nova.compute.manager [-] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 05 06:22:39 compute-0 nova_compute[186329]: 2025-12-05 06:22:39.838 186333 DEBUG nova.network.neutron [-] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 05 06:22:39 compute-0 nova_compute[186329]: 2025-12-05 06:22:39.838 186333 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:22:40 compute-0 nova_compute[186329]: 2025-12-05 06:22:40.215 186333 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:22:40 compute-0 nova_compute[186329]: 2025-12-05 06:22:40.477 186333 DEBUG nova.compute.manager [req-e9b3d118-ee7c-45c3-8da4-8a0578357e56 req-cf257425-0e83-47a2-bd7c-17dcb6dfcd39 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Received event network-vif-deleted-e4b4ab1b-b70e-45e2-b062-e5da4d3c3385 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:22:40 compute-0 nova_compute[186329]: 2025-12-05 06:22:40.478 186333 INFO nova.compute.manager [req-e9b3d118-ee7c-45c3-8da4-8a0578357e56 req-cf257425-0e83-47a2-bd7c-17dcb6dfcd39 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Neutron deleted interface e4b4ab1b-b70e-45e2-b062-e5da4d3c3385; detaching it from the instance and deleting it from the info cache
Dec 05 06:22:40 compute-0 nova_compute[186329]: 2025-12-05 06:22:40.478 186333 DEBUG nova.network.neutron [req-e9b3d118-ee7c-45c3-8da4-8a0578357e56 req-cf257425-0e83-47a2-bd7c-17dcb6dfcd39 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:22:40 compute-0 nova_compute[186329]: 2025-12-05 06:22:40.790 186333 DEBUG nova.compute.manager [req-1a4fd045-75cd-48c5-ab93-dff6f293ca69 req-573ebb25-79ad-40f5-ba5e-9547c17853c7 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Received event network-vif-unplugged-e4b4ab1b-b70e-45e2-b062-e5da4d3c3385 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:22:40 compute-0 nova_compute[186329]: 2025-12-05 06:22:40.791 186333 DEBUG oslo_concurrency.lockutils [req-1a4fd045-75cd-48c5-ab93-dff6f293ca69 req-573ebb25-79ad-40f5-ba5e-9547c17853c7 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "ee8648e9-dcf8-46a3-a89f-3d3dedafbe21-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:22:40 compute-0 nova_compute[186329]: 2025-12-05 06:22:40.791 186333 DEBUG oslo_concurrency.lockutils [req-1a4fd045-75cd-48c5-ab93-dff6f293ca69 req-573ebb25-79ad-40f5-ba5e-9547c17853c7 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "ee8648e9-dcf8-46a3-a89f-3d3dedafbe21-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:22:40 compute-0 nova_compute[186329]: 2025-12-05 06:22:40.791 186333 DEBUG oslo_concurrency.lockutils [req-1a4fd045-75cd-48c5-ab93-dff6f293ca69 req-573ebb25-79ad-40f5-ba5e-9547c17853c7 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "ee8648e9-dcf8-46a3-a89f-3d3dedafbe21-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:22:40 compute-0 nova_compute[186329]: 2025-12-05 06:22:40.792 186333 DEBUG nova.compute.manager [req-1a4fd045-75cd-48c5-ab93-dff6f293ca69 req-573ebb25-79ad-40f5-ba5e-9547c17853c7 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] No waiting events found dispatching network-vif-unplugged-e4b4ab1b-b70e-45e2-b062-e5da4d3c3385 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:22:40 compute-0 nova_compute[186329]: 2025-12-05 06:22:40.792 186333 DEBUG nova.compute.manager [req-1a4fd045-75cd-48c5-ab93-dff6f293ca69 req-573ebb25-79ad-40f5-ba5e-9547c17853c7 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Received event network-vif-unplugged-e4b4ab1b-b70e-45e2-b062-e5da4d3c3385 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 05 06:22:40 compute-0 nova_compute[186329]: 2025-12-05 06:22:40.932 186333 DEBUG nova.network.neutron [-] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:22:40 compute-0 nova_compute[186329]: 2025-12-05 06:22:40.982 186333 DEBUG nova.compute.manager [req-e9b3d118-ee7c-45c3-8da4-8a0578357e56 req-cf257425-0e83-47a2-bd7c-17dcb6dfcd39 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Detach interface failed, port_id=e4b4ab1b-b70e-45e2-b062-e5da4d3c3385, reason: Instance ee8648e9-dcf8-46a3-a89f-3d3dedafbe21 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Dec 05 06:22:41 compute-0 nova_compute[186329]: 2025-12-05 06:22:41.437 186333 INFO nova.compute.manager [-] [instance: ee8648e9-dcf8-46a3-a89f-3d3dedafbe21] Took 1.60 seconds to deallocate network for instance.
Dec 05 06:22:41 compute-0 nova_compute[186329]: 2025-12-05 06:22:41.949 186333 DEBUG oslo_concurrency.lockutils [None req-d39b406c-6bf7-4b46-a1fd-40a92a602d59 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:22:41 compute-0 nova_compute[186329]: 2025-12-05 06:22:41.950 186333 DEBUG oslo_concurrency.lockutils [None req-d39b406c-6bf7-4b46-a1fd-40a92a602d59 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:22:41 compute-0 nova_compute[186329]: 2025-12-05 06:22:41.981 186333 DEBUG nova.scheduler.client.report [None req-d39b406c-6bf7-4b46-a1fd-40a92a602d59 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Refreshing inventories for resource provider f2df025e-56e9-4920-9fad-1a12202c4aeb _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Dec 05 06:22:41 compute-0 nova_compute[186329]: 2025-12-05 06:22:41.991 186333 DEBUG nova.scheduler.client.report [None req-d39b406c-6bf7-4b46-a1fd-40a92a602d59 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Updating ProviderTree inventory for provider f2df025e-56e9-4920-9fad-1a12202c4aeb from _refresh_and_get_inventory using data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Dec 05 06:22:41 compute-0 nova_compute[186329]: 2025-12-05 06:22:41.992 186333 DEBUG nova.compute.provider_tree [None req-d39b406c-6bf7-4b46-a1fd-40a92a602d59 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Updating inventory in ProviderTree for provider f2df025e-56e9-4920-9fad-1a12202c4aeb with inventory: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 05 06:22:42 compute-0 nova_compute[186329]: 2025-12-05 06:22:42.000 186333 DEBUG nova.scheduler.client.report [None req-d39b406c-6bf7-4b46-a1fd-40a92a602d59 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Refreshing aggregate associations for resource provider f2df025e-56e9-4920-9fad-1a12202c4aeb, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Dec 05 06:22:42 compute-0 nova_compute[186329]: 2025-12-05 06:22:42.019 186333 DEBUG nova.scheduler.client.report [None req-d39b406c-6bf7-4b46-a1fd-40a92a602d59 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Refreshing trait associations for resource provider f2df025e-56e9-4920-9fad-1a12202c4aeb, traits: COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_EXTEND,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SOUND_MODEL_SB16,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE42,COMPUTE_NET_VIRTIO_PACKED,HW_CPU_X86_SSE2,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_TPM_CRB,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_TPM_TIS,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SOUND_MODEL_ES1370,HW_ARCH_X86_64,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SOUND_MODEL_USB,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_TRUSTED_CERTS,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_ARCH_X86_64,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_INTEL _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Dec 05 06:22:42 compute-0 nova_compute[186329]: 2025-12-05 06:22:42.059 186333 DEBUG nova.compute.provider_tree [None req-d39b406c-6bf7-4b46-a1fd-40a92a602d59 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:22:42 compute-0 nova_compute[186329]: 2025-12-05 06:22:42.563 186333 DEBUG nova.scheduler.client.report [None req-d39b406c-6bf7-4b46-a1fd-40a92a602d59 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:22:42 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:42.617 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=89d40815-76f5-4f1d-9077-84d831b7d6c4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:22:43 compute-0 nova_compute[186329]: 2025-12-05 06:22:43.070 186333 DEBUG oslo_concurrency.lockutils [None req-d39b406c-6bf7-4b46-a1fd-40a92a602d59 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.120s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:22:43 compute-0 nova_compute[186329]: 2025-12-05 06:22:43.088 186333 INFO nova.scheduler.client.report [None req-d39b406c-6bf7-4b46-a1fd-40a92a602d59 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Deleted allocations for instance ee8648e9-dcf8-46a3-a89f-3d3dedafbe21
Dec 05 06:22:43 compute-0 nova_compute[186329]: 2025-12-05 06:22:43.807 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:44 compute-0 nova_compute[186329]: 2025-12-05 06:22:44.105 186333 DEBUG oslo_concurrency.lockutils [None req-d39b406c-6bf7-4b46-a1fd-40a92a602d59 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lock "ee8648e9-dcf8-46a3-a89f-3d3dedafbe21" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.033s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:22:44 compute-0 nova_compute[186329]: 2025-12-05 06:22:44.327 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:44 compute-0 nova_compute[186329]: 2025-12-05 06:22:44.803 186333 DEBUG oslo_concurrency.lockutils [None req-d9a94576-7a19-4fb0-a170-e5a57d7695a8 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Acquiring lock "6f606379-b808-4558-9b6b-9d7501c6f2c5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:22:44 compute-0 nova_compute[186329]: 2025-12-05 06:22:44.804 186333 DEBUG oslo_concurrency.lockutils [None req-d9a94576-7a19-4fb0-a170-e5a57d7695a8 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lock "6f606379-b808-4558-9b6b-9d7501c6f2c5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:22:44 compute-0 nova_compute[186329]: 2025-12-05 06:22:44.804 186333 DEBUG oslo_concurrency.lockutils [None req-d9a94576-7a19-4fb0-a170-e5a57d7695a8 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Acquiring lock "6f606379-b808-4558-9b6b-9d7501c6f2c5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:22:44 compute-0 nova_compute[186329]: 2025-12-05 06:22:44.804 186333 DEBUG oslo_concurrency.lockutils [None req-d9a94576-7a19-4fb0-a170-e5a57d7695a8 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lock "6f606379-b808-4558-9b6b-9d7501c6f2c5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:22:44 compute-0 nova_compute[186329]: 2025-12-05 06:22:44.804 186333 DEBUG oslo_concurrency.lockutils [None req-d9a94576-7a19-4fb0-a170-e5a57d7695a8 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lock "6f606379-b808-4558-9b6b-9d7501c6f2c5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:22:44 compute-0 nova_compute[186329]: 2025-12-05 06:22:44.811 186333 INFO nova.compute.manager [None req-d9a94576-7a19-4fb0-a170-e5a57d7695a8 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 6f606379-b808-4558-9b6b-9d7501c6f2c5] Terminating instance
Dec 05 06:22:45 compute-0 nova_compute[186329]: 2025-12-05 06:22:45.321 186333 DEBUG nova.compute.manager [None req-d9a94576-7a19-4fb0-a170-e5a57d7695a8 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 6f606379-b808-4558-9b6b-9d7501c6f2c5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 05 06:22:45 compute-0 kernel: tapcd470aa0-c3 (unregistering): left promiscuous mode
Dec 05 06:22:45 compute-0 NetworkManager[55434]: <info>  [1764915765.3484] device (tapcd470aa0-c3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 06:22:45 compute-0 ovn_controller[95223]: 2025-12-05T06:22:45Z|00126|binding|INFO|Releasing lport cd470aa0-c3d9-4008-b067-57c65c14fb38 from this chassis (sb_readonly=0)
Dec 05 06:22:45 compute-0 ovn_controller[95223]: 2025-12-05T06:22:45Z|00127|binding|INFO|Setting lport cd470aa0-c3d9-4008-b067-57c65c14fb38 down in Southbound
Dec 05 06:22:45 compute-0 ovn_controller[95223]: 2025-12-05T06:22:45Z|00128|binding|INFO|Removing iface tapcd470aa0-c3 ovn-installed in OVS
Dec 05 06:22:45 compute-0 nova_compute[186329]: 2025-12-05 06:22:45.354 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:45 compute-0 nova_compute[186329]: 2025-12-05 06:22:45.355 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:45 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:45.358 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8c:fa:b2 10.100.0.11'], port_security=['fa:16:3e:8c:fa:b2 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '6f606379-b808-4558-9b6b-9d7501c6f2c5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8c6607ab-315b-4ce0-bb4d-e22d0d588c81', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bc5d63a38e00424aa78cb06b6b41bc09', 'neutron:revision_number': '14', 'neutron:security_group_ids': '7b69bd8c-f5f0-4d39-bcf7-4157a4a0e235', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d520596c-b299-4696-9930-0280ad4bb232, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], logical_port=cd470aa0-c3d9-4008-b067-57c65c14fb38) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:22:45 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:45.358 104041 INFO neutron.agent.ovn.metadata.agent [-] Port cd470aa0-c3d9-4008-b067-57c65c14fb38 in datapath 8c6607ab-315b-4ce0-bb4d-e22d0d588c81 unbound from our chassis
Dec 05 06:22:45 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:45.359 104041 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8c6607ab-315b-4ce0-bb4d-e22d0d588c81, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 05 06:22:45 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:45.360 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[fd56c148-a060-461f-b13c-b29e01d5aca2]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:22:45 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:45.360 104041 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81 namespace which is not needed anymore
Dec 05 06:22:45 compute-0 nova_compute[186329]: 2025-12-05 06:22:45.372 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:45 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Dec 05 06:22:45 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000c.scope: Consumed 2.771s CPU time.
Dec 05 06:22:45 compute-0 systemd-machined[152967]: Machine qemu-10-instance-0000000c terminated.
Dec 05 06:22:45 compute-0 neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81[210866]: [NOTICE]   (210870) : haproxy version is 3.0.5-8e879a5
Dec 05 06:22:45 compute-0 neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81[210866]: [NOTICE]   (210870) : path to executable is /usr/sbin/haproxy
Dec 05 06:22:45 compute-0 neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81[210866]: [WARNING]  (210870) : Exiting Master process...
Dec 05 06:22:45 compute-0 neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81[210866]: [ALERT]    (210870) : Current worker (210872) exited with code 143 (Terminated)
Dec 05 06:22:45 compute-0 podman[211230]: 2025-12-05 06:22:45.444889779 +0000 UTC m=+0.021214654 container kill 881badd4f71b9610f3f7c5ab5c0f6f2b28e18ed72a681d49622e6c4aa3e116fa (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 05 06:22:45 compute-0 neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81[210866]: [WARNING]  (210870) : All workers exited. Exiting... (0)
Dec 05 06:22:45 compute-0 systemd[1]: libpod-881badd4f71b9610f3f7c5ab5c0f6f2b28e18ed72a681d49622e6c4aa3e116fa.scope: Deactivated successfully.
Dec 05 06:22:45 compute-0 nova_compute[186329]: 2025-12-05 06:22:45.456 186333 DEBUG nova.compute.manager [req-5455f6c7-ae63-4ad5-8990-72c8f45aaffd req-41dbc71e-dfec-4bdf-a247-6b383e70d968 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 6f606379-b808-4558-9b6b-9d7501c6f2c5] Received event network-vif-unplugged-cd470aa0-c3d9-4008-b067-57c65c14fb38 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:22:45 compute-0 nova_compute[186329]: 2025-12-05 06:22:45.457 186333 DEBUG oslo_concurrency.lockutils [req-5455f6c7-ae63-4ad5-8990-72c8f45aaffd req-41dbc71e-dfec-4bdf-a247-6b383e70d968 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "6f606379-b808-4558-9b6b-9d7501c6f2c5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:22:45 compute-0 nova_compute[186329]: 2025-12-05 06:22:45.457 186333 DEBUG oslo_concurrency.lockutils [req-5455f6c7-ae63-4ad5-8990-72c8f45aaffd req-41dbc71e-dfec-4bdf-a247-6b383e70d968 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "6f606379-b808-4558-9b6b-9d7501c6f2c5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:22:45 compute-0 nova_compute[186329]: 2025-12-05 06:22:45.457 186333 DEBUG oslo_concurrency.lockutils [req-5455f6c7-ae63-4ad5-8990-72c8f45aaffd req-41dbc71e-dfec-4bdf-a247-6b383e70d968 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "6f606379-b808-4558-9b6b-9d7501c6f2c5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:22:45 compute-0 nova_compute[186329]: 2025-12-05 06:22:45.457 186333 DEBUG nova.compute.manager [req-5455f6c7-ae63-4ad5-8990-72c8f45aaffd req-41dbc71e-dfec-4bdf-a247-6b383e70d968 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 6f606379-b808-4558-9b6b-9d7501c6f2c5] No waiting events found dispatching network-vif-unplugged-cd470aa0-c3d9-4008-b067-57c65c14fb38 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:22:45 compute-0 nova_compute[186329]: 2025-12-05 06:22:45.457 186333 DEBUG nova.compute.manager [req-5455f6c7-ae63-4ad5-8990-72c8f45aaffd req-41dbc71e-dfec-4bdf-a247-6b383e70d968 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 6f606379-b808-4558-9b6b-9d7501c6f2c5] Received event network-vif-unplugged-cd470aa0-c3d9-4008-b067-57c65c14fb38 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 05 06:22:45 compute-0 podman[211242]: 2025-12-05 06:22:45.479144848 +0000 UTC m=+0.018144993 container died 881badd4f71b9610f3f7c5ab5c0f6f2b28e18ed72a681d49622e6c4aa3e116fa (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 05 06:22:45 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-881badd4f71b9610f3f7c5ab5c0f6f2b28e18ed72a681d49622e6c4aa3e116fa-userdata-shm.mount: Deactivated successfully.
Dec 05 06:22:45 compute-0 systemd[1]: var-lib-containers-storage-overlay-6d279c169207874742a88ca4e6aeb0c993b0dbfda52138e5b1d94ff9facdf7ec-merged.mount: Deactivated successfully.
Dec 05 06:22:45 compute-0 podman[211242]: 2025-12-05 06:22:45.497399926 +0000 UTC m=+0.036400051 container cleanup 881badd4f71b9610f3f7c5ab5c0f6f2b28e18ed72a681d49622e6c4aa3e116fa (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 05 06:22:45 compute-0 systemd[1]: libpod-conmon-881badd4f71b9610f3f7c5ab5c0f6f2b28e18ed72a681d49622e6c4aa3e116fa.scope: Deactivated successfully.
Dec 05 06:22:45 compute-0 podman[211243]: 2025-12-05 06:22:45.507966651 +0000 UTC m=+0.045729641 container remove 881badd4f71b9610f3f7c5ab5c0f6f2b28e18ed72a681d49622e6c4aa3e116fa (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec)
Dec 05 06:22:45 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:45.511 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[3eab8620-9e36-4a68-91c2-7cd942df0348]: (4, ("Fri Dec  5 06:22:45 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81 (881badd4f71b9610f3f7c5ab5c0f6f2b28e18ed72a681d49622e6c4aa3e116fa)\n881badd4f71b9610f3f7c5ab5c0f6f2b28e18ed72a681d49622e6c4aa3e116fa\nFri Dec  5 06:22:45 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81 (881badd4f71b9610f3f7c5ab5c0f6f2b28e18ed72a681d49622e6c4aa3e116fa)\n881badd4f71b9610f3f7c5ab5c0f6f2b28e18ed72a681d49622e6c4aa3e116fa\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:22:45 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:45.512 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[2e3073ab-5aa7-45fe-8da9-5d11f29f9f39]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:22:45 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:45.513 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8c6607ab-315b-4ce0-bb4d-e22d0d588c81.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8c6607ab-315b-4ce0-bb4d-e22d0d588c81.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:22:45 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:45.513 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[96be9073-303a-416f-a671-2016aabd370b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:22:45 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:45.514 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8c6607ab-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:22:45 compute-0 nova_compute[186329]: 2025-12-05 06:22:45.515 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:45 compute-0 kernel: tap8c6607ab-30: left promiscuous mode
Dec 05 06:22:45 compute-0 nova_compute[186329]: 2025-12-05 06:22:45.531 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:45 compute-0 NetworkManager[55434]: <info>  [1764915765.5333] manager: (tapcd470aa0-c3): new Tun device (/org/freedesktop/NetworkManager/Devices/57)
Dec 05 06:22:45 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:45.534 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[4edcb1b8-8258-4b16-808f-cfe447d5a650]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:22:45 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:45.546 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[d5bd27ad-ad34-464a-9039-09211ee18171]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:22:45 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:45.546 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[eadb48bc-01de-41af-84c8-8fa2aa835f97]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:22:45 compute-0 nova_compute[186329]: 2025-12-05 06:22:45.559 186333 INFO nova.virt.libvirt.driver [-] [instance: 6f606379-b808-4558-9b6b-9d7501c6f2c5] Instance destroyed successfully.
Dec 05 06:22:45 compute-0 nova_compute[186329]: 2025-12-05 06:22:45.559 186333 DEBUG nova.objects.instance [None req-d9a94576-7a19-4fb0-a170-e5a57d7695a8 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lazy-loading 'resources' on Instance uuid 6f606379-b808-4558-9b6b-9d7501c6f2c5 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:22:45 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:45.560 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[3654199f-55b0-43f9-a839-a569f8c201a0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 358233, 'reachable_time': 17740, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211281, 'error': None, 'target': 'ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:22:45 compute-0 systemd[1]: run-netns-ovnmeta\x2d8c6607ab\x2d315b\x2d4ce0\x2dbb4d\x2de22d0d588c81.mount: Deactivated successfully.
Dec 05 06:22:45 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:45.561 104153 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 05 06:22:45 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:22:45.562 104153 DEBUG oslo.privsep.daemon [-] privsep: reply[57af62cb-2dec-4567-a0b5-8a28e07591b8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:22:46 compute-0 nova_compute[186329]: 2025-12-05 06:22:46.064 186333 DEBUG nova.virt.libvirt.vif [None req-d9a94576-7a19-4fb0-a170-e5a57d7695a8 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-12-05T06:21:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-641343724',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-641',id=12,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T06:21:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='bc5d63a38e00424aa78cb06b6b41bc09',ramdisk_id='',reservation_id='r-sn2y1vt0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',clean_attempts='1',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio'
,image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-409405411',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-409405411-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T06:22:24Z,user_data=None,user_id='1b1d9849dd3f4328991385825f24dc8f',uuid=6f606379-b808-4558-9b6b-9d7501c6f2c5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cd470aa0-c3d9-4008-b067-57c65c14fb38", "address": "fa:16:3e:8c:fa:b2", "network": {"id": "8c6607ab-315b-4ce0-bb4d-e22d0d588c81", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1945444158-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa7b2a65c9a54b598b902ce6fa21d41e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd470aa0-c3", "ovs_interfaceid": "cd470aa0-c3d9-4008-b067-57c65c14fb38", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 05 06:22:46 compute-0 nova_compute[186329]: 2025-12-05 06:22:46.064 186333 DEBUG nova.network.os_vif_util [None req-d9a94576-7a19-4fb0-a170-e5a57d7695a8 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Converting VIF {"id": "cd470aa0-c3d9-4008-b067-57c65c14fb38", "address": "fa:16:3e:8c:fa:b2", "network": {"id": "8c6607ab-315b-4ce0-bb4d-e22d0d588c81", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1945444158-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa7b2a65c9a54b598b902ce6fa21d41e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcd470aa0-c3", "ovs_interfaceid": "cd470aa0-c3d9-4008-b067-57c65c14fb38", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:22:46 compute-0 nova_compute[186329]: 2025-12-05 06:22:46.065 186333 DEBUG nova.network.os_vif_util [None req-d9a94576-7a19-4fb0-a170-e5a57d7695a8 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8c:fa:b2,bridge_name='br-int',has_traffic_filtering=True,id=cd470aa0-c3d9-4008-b067-57c65c14fb38,network=Network(8c6607ab-315b-4ce0-bb4d-e22d0d588c81),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd470aa0-c3') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:22:46 compute-0 nova_compute[186329]: 2025-12-05 06:22:46.065 186333 DEBUG os_vif [None req-d9a94576-7a19-4fb0-a170-e5a57d7695a8 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8c:fa:b2,bridge_name='br-int',has_traffic_filtering=True,id=cd470aa0-c3d9-4008-b067-57c65c14fb38,network=Network(8c6607ab-315b-4ce0-bb4d-e22d0d588c81),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd470aa0-c3') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 05 06:22:46 compute-0 nova_compute[186329]: 2025-12-05 06:22:46.066 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:46 compute-0 nova_compute[186329]: 2025-12-05 06:22:46.066 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcd470aa0-c3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:22:46 compute-0 nova_compute[186329]: 2025-12-05 06:22:46.068 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:46 compute-0 nova_compute[186329]: 2025-12-05 06:22:46.070 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:22:46 compute-0 nova_compute[186329]: 2025-12-05 06:22:46.071 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:46 compute-0 nova_compute[186329]: 2025-12-05 06:22:46.071 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=527b7b3f-4f55-444a-b2bf-e6a3375bc816) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:22:46 compute-0 nova_compute[186329]: 2025-12-05 06:22:46.072 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:46 compute-0 nova_compute[186329]: 2025-12-05 06:22:46.072 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:46 compute-0 nova_compute[186329]: 2025-12-05 06:22:46.074 186333 INFO os_vif [None req-d9a94576-7a19-4fb0-a170-e5a57d7695a8 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8c:fa:b2,bridge_name='br-int',has_traffic_filtering=True,id=cd470aa0-c3d9-4008-b067-57c65c14fb38,network=Network(8c6607ab-315b-4ce0-bb4d-e22d0d588c81),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcd470aa0-c3')
Dec 05 06:22:46 compute-0 nova_compute[186329]: 2025-12-05 06:22:46.074 186333 INFO nova.virt.libvirt.driver [None req-d9a94576-7a19-4fb0-a170-e5a57d7695a8 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 6f606379-b808-4558-9b6b-9d7501c6f2c5] Deleting instance files /var/lib/nova/instances/6f606379-b808-4558-9b6b-9d7501c6f2c5_del
Dec 05 06:22:46 compute-0 nova_compute[186329]: 2025-12-05 06:22:46.075 186333 INFO nova.virt.libvirt.driver [None req-d9a94576-7a19-4fb0-a170-e5a57d7695a8 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 6f606379-b808-4558-9b6b-9d7501c6f2c5] Deletion of /var/lib/nova/instances/6f606379-b808-4558-9b6b-9d7501c6f2c5_del complete
Dec 05 06:22:46 compute-0 nova_compute[186329]: 2025-12-05 06:22:46.582 186333 INFO nova.compute.manager [None req-d9a94576-7a19-4fb0-a170-e5a57d7695a8 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 6f606379-b808-4558-9b6b-9d7501c6f2c5] Took 1.26 seconds to destroy the instance on the hypervisor.
Dec 05 06:22:46 compute-0 nova_compute[186329]: 2025-12-05 06:22:46.583 186333 DEBUG oslo.service.backend._eventlet.loopingcall [None req-d9a94576-7a19-4fb0-a170-e5a57d7695a8 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 05 06:22:46 compute-0 nova_compute[186329]: 2025-12-05 06:22:46.583 186333 DEBUG nova.compute.manager [-] [instance: 6f606379-b808-4558-9b6b-9d7501c6f2c5] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 05 06:22:46 compute-0 nova_compute[186329]: 2025-12-05 06:22:46.583 186333 DEBUG nova.network.neutron [-] [instance: 6f606379-b808-4558-9b6b-9d7501c6f2c5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 05 06:22:46 compute-0 nova_compute[186329]: 2025-12-05 06:22:46.583 186333 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:22:46 compute-0 nova_compute[186329]: 2025-12-05 06:22:46.732 186333 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:22:47 compute-0 nova_compute[186329]: 2025-12-05 06:22:47.470 186333 DEBUG nova.network.neutron [-] [instance: 6f606379-b808-4558-9b6b-9d7501c6f2c5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:22:47 compute-0 nova_compute[186329]: 2025-12-05 06:22:47.504 186333 DEBUG nova.compute.manager [req-5e648575-11b5-4c12-835e-e5e113c3ba7f req-5586d514-ed2f-4778-a2b4-1b62d8f9a910 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 6f606379-b808-4558-9b6b-9d7501c6f2c5] Received event network-vif-unplugged-cd470aa0-c3d9-4008-b067-57c65c14fb38 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:22:47 compute-0 nova_compute[186329]: 2025-12-05 06:22:47.505 186333 DEBUG oslo_concurrency.lockutils [req-5e648575-11b5-4c12-835e-e5e113c3ba7f req-5586d514-ed2f-4778-a2b4-1b62d8f9a910 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "6f606379-b808-4558-9b6b-9d7501c6f2c5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:22:47 compute-0 nova_compute[186329]: 2025-12-05 06:22:47.505 186333 DEBUG oslo_concurrency.lockutils [req-5e648575-11b5-4c12-835e-e5e113c3ba7f req-5586d514-ed2f-4778-a2b4-1b62d8f9a910 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "6f606379-b808-4558-9b6b-9d7501c6f2c5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:22:47 compute-0 nova_compute[186329]: 2025-12-05 06:22:47.505 186333 DEBUG oslo_concurrency.lockutils [req-5e648575-11b5-4c12-835e-e5e113c3ba7f req-5586d514-ed2f-4778-a2b4-1b62d8f9a910 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "6f606379-b808-4558-9b6b-9d7501c6f2c5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:22:47 compute-0 nova_compute[186329]: 2025-12-05 06:22:47.505 186333 DEBUG nova.compute.manager [req-5e648575-11b5-4c12-835e-e5e113c3ba7f req-5586d514-ed2f-4778-a2b4-1b62d8f9a910 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 6f606379-b808-4558-9b6b-9d7501c6f2c5] No waiting events found dispatching network-vif-unplugged-cd470aa0-c3d9-4008-b067-57c65c14fb38 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:22:47 compute-0 nova_compute[186329]: 2025-12-05 06:22:47.505 186333 DEBUG nova.compute.manager [req-5e648575-11b5-4c12-835e-e5e113c3ba7f req-5586d514-ed2f-4778-a2b4-1b62d8f9a910 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 6f606379-b808-4558-9b6b-9d7501c6f2c5] Received event network-vif-unplugged-cd470aa0-c3d9-4008-b067-57c65c14fb38 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 05 06:22:47 compute-0 nova_compute[186329]: 2025-12-05 06:22:47.506 186333 DEBUG nova.compute.manager [req-5e648575-11b5-4c12-835e-e5e113c3ba7f req-5586d514-ed2f-4778-a2b4-1b62d8f9a910 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 6f606379-b808-4558-9b6b-9d7501c6f2c5] Received event network-vif-deleted-cd470aa0-c3d9-4008-b067-57c65c14fb38 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:22:47 compute-0 nova_compute[186329]: 2025-12-05 06:22:47.975 186333 INFO nova.compute.manager [-] [instance: 6f606379-b808-4558-9b6b-9d7501c6f2c5] Took 1.39 seconds to deallocate network for instance.
Dec 05 06:22:48 compute-0 nova_compute[186329]: 2025-12-05 06:22:48.487 186333 DEBUG oslo_concurrency.lockutils [None req-d9a94576-7a19-4fb0-a170-e5a57d7695a8 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:22:48 compute-0 nova_compute[186329]: 2025-12-05 06:22:48.488 186333 DEBUG oslo_concurrency.lockutils [None req-d9a94576-7a19-4fb0-a170-e5a57d7695a8 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:22:48 compute-0 nova_compute[186329]: 2025-12-05 06:22:48.492 186333 DEBUG oslo_concurrency.lockutils [None req-d9a94576-7a19-4fb0-a170-e5a57d7695a8 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.004s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:22:48 compute-0 nova_compute[186329]: 2025-12-05 06:22:48.515 186333 INFO nova.scheduler.client.report [None req-d9a94576-7a19-4fb0-a170-e5a57d7695a8 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Deleted allocations for instance 6f606379-b808-4558-9b6b-9d7501c6f2c5
Dec 05 06:22:48 compute-0 nova_compute[186329]: 2025-12-05 06:22:48.705 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:22:48 compute-0 nova_compute[186329]: 2025-12-05 06:22:48.808 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:49 compute-0 nova_compute[186329]: 2025-12-05 06:22:49.214 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:22:49 compute-0 nova_compute[186329]: 2025-12-05 06:22:49.530 186333 DEBUG oslo_concurrency.lockutils [None req-d9a94576-7a19-4fb0-a170-e5a57d7695a8 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lock "6f606379-b808-4558-9b6b-9d7501c6f2c5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.726s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:22:51 compute-0 nova_compute[186329]: 2025-12-05 06:22:51.073 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:51 compute-0 nova_compute[186329]: 2025-12-05 06:22:51.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:22:51 compute-0 nova_compute[186329]: 2025-12-05 06:22:51.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:22:52 compute-0 nova_compute[186329]: 2025-12-05 06:22:52.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:22:52 compute-0 nova_compute[186329]: 2025-12-05 06:22:52.710 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 05 06:22:52 compute-0 nova_compute[186329]: 2025-12-05 06:22:52.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:22:53 compute-0 nova_compute[186329]: 2025-12-05 06:22:53.221 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:22:53 compute-0 nova_compute[186329]: 2025-12-05 06:22:53.221 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:22:53 compute-0 nova_compute[186329]: 2025-12-05 06:22:53.222 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:22:53 compute-0 nova_compute[186329]: 2025-12-05 06:22:53.222 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 05 06:22:53 compute-0 nova_compute[186329]: 2025-12-05 06:22:53.398 186333 WARNING nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:22:53 compute-0 nova_compute[186329]: 2025-12-05 06:22:53.399 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:22:53 compute-0 nova_compute[186329]: 2025-12-05 06:22:53.412 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.013s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:22:53 compute-0 nova_compute[186329]: 2025-12-05 06:22:53.413 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5858MB free_disk=73.16719818115234GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": 
"0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 05 06:22:53 compute-0 nova_compute[186329]: 2025-12-05 06:22:53.413 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:22:53 compute-0 nova_compute[186329]: 2025-12-05 06:22:53.413 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:22:53 compute-0 nova_compute[186329]: 2025-12-05 06:22:53.810 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:54 compute-0 podman[211290]: 2025-12-05 06:22:54.463452076 +0000 UTC m=+0.045183024 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 06:22:54 compute-0 podman[211289]: 2025-12-05 06:22:54.482136131 +0000 UTC m=+0.064477938 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, container_name=ovn_controller, org.label-schema.vendor=CentOS)
Dec 05 06:22:54 compute-0 nova_compute[186329]: 2025-12-05 06:22:54.954 186333 INFO nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance 83dcc9e9-9fcf-4e34-8b76-598c38ed9bc0 has allocations against this compute host but is not found in the database.
Dec 05 06:22:54 compute-0 nova_compute[186329]: 2025-12-05 06:22:54.954 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 05 06:22:54 compute-0 nova_compute[186329]: 2025-12-05 06:22:54.954 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 06:22:53 up  1:00,  0 user,  load average: 0.07, 0.16, 0.26\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 05 06:22:54 compute-0 nova_compute[186329]: 2025-12-05 06:22:54.981 186333 DEBUG nova.compute.provider_tree [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:22:55 compute-0 nova_compute[186329]: 2025-12-05 06:22:55.486 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:22:55 compute-0 nova_compute[186329]: 2025-12-05 06:22:55.992 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 05 06:22:55 compute-0 nova_compute[186329]: 2025-12-05 06:22:55.992 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.579s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:22:56 compute-0 nova_compute[186329]: 2025-12-05 06:22:56.075 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:56 compute-0 nova_compute[186329]: 2025-12-05 06:22:56.992 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:22:56 compute-0 nova_compute[186329]: 2025-12-05 06:22:56.993 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:22:58 compute-0 nova_compute[186329]: 2025-12-05 06:22:58.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:22:58 compute-0 nova_compute[186329]: 2025-12-05 06:22:58.811 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:22:59 compute-0 podman[196599]: time="2025-12-05T06:22:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:22:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:22:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:22:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:22:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2588 "" "Go-http-client/1.1"
Dec 05 06:23:01 compute-0 nova_compute[186329]: 2025-12-05 06:23:01.077 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:01 compute-0 openstack_network_exporter[198686]: ERROR   06:23:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:23:01 compute-0 openstack_network_exporter[198686]: ERROR   06:23:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:23:01 compute-0 openstack_network_exporter[198686]: ERROR   06:23:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:23:01 compute-0 openstack_network_exporter[198686]: ERROR   06:23:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:23:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:23:01 compute-0 openstack_network_exporter[198686]: ERROR   06:23:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:23:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:23:03 compute-0 nova_compute[186329]: 2025-12-05 06:23:03.813 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:04 compute-0 podman[211335]: 2025-12-05 06:23:04.457724017 +0000 UTC m=+0.041638519 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 05 06:23:04 compute-0 podman[211337]: 2025-12-05 06:23:04.468237752 +0000 UTC m=+0.045830079 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 05 06:23:04 compute-0 podman[211336]: 2025-12-05 06:23:04.477357638 +0000 UTC m=+0.059083697 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, release=1755695350, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, architecture=x86_64, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, config_id=edpm, maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.buildah.version=1.33.7)
Dec 05 06:23:06 compute-0 nova_compute[186329]: 2025-12-05 06:23:06.079 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:08 compute-0 nova_compute[186329]: 2025-12-05 06:23:08.814 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:11 compute-0 nova_compute[186329]: 2025-12-05 06:23:11.080 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:11 compute-0 nova_compute[186329]: 2025-12-05 06:23:11.534 186333 DEBUG oslo_concurrency.lockutils [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Acquiring lock "8ed77332-76f2-4438-9d44-961742779ff9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:23:11 compute-0 nova_compute[186329]: 2025-12-05 06:23:11.534 186333 DEBUG oslo_concurrency.lockutils [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lock "8ed77332-76f2-4438-9d44-961742779ff9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:23:12 compute-0 nova_compute[186329]: 2025-12-05 06:23:12.037 186333 DEBUG nova.compute.manager [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Dec 05 06:23:12 compute-0 nova_compute[186329]: 2025-12-05 06:23:12.573 186333 DEBUG oslo_concurrency.lockutils [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:23:12 compute-0 nova_compute[186329]: 2025-12-05 06:23:12.573 186333 DEBUG oslo_concurrency.lockutils [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:23:12 compute-0 nova_compute[186329]: 2025-12-05 06:23:12.577 186333 DEBUG nova.virt.hardware [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 05 06:23:12 compute-0 nova_compute[186329]: 2025-12-05 06:23:12.577 186333 INFO nova.compute.claims [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Claim successful on node compute-0.ctlplane.example.com
Dec 05 06:23:13 compute-0 nova_compute[186329]: 2025-12-05 06:23:13.621 186333 DEBUG nova.compute.provider_tree [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:23:13 compute-0 nova_compute[186329]: 2025-12-05 06:23:13.815 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:14 compute-0 nova_compute[186329]: 2025-12-05 06:23:14.125 186333 DEBUG nova.scheduler.client.report [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:23:14 compute-0 nova_compute[186329]: 2025-12-05 06:23:14.632 186333 DEBUG oslo_concurrency.lockutils [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.059s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:23:14 compute-0 nova_compute[186329]: 2025-12-05 06:23:14.633 186333 DEBUG nova.compute.manager [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Dec 05 06:23:15 compute-0 nova_compute[186329]: 2025-12-05 06:23:15.140 186333 DEBUG nova.compute.manager [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Dec 05 06:23:15 compute-0 nova_compute[186329]: 2025-12-05 06:23:15.141 186333 DEBUG nova.network.neutron [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Dec 05 06:23:15 compute-0 nova_compute[186329]: 2025-12-05 06:23:15.141 186333 WARNING neutronclient.v2_0.client [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:23:15 compute-0 nova_compute[186329]: 2025-12-05 06:23:15.142 186333 WARNING neutronclient.v2_0.client [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:23:15 compute-0 nova_compute[186329]: 2025-12-05 06:23:15.647 186333 INFO nova.virt.libvirt.driver [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 06:23:15 compute-0 nova_compute[186329]: 2025-12-05 06:23:15.903 186333 DEBUG nova.network.neutron [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Successfully created port: 3f0185ae-482f-4962-94bc-a3aea2d2ea77 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Dec 05 06:23:16 compute-0 nova_compute[186329]: 2025-12-05 06:23:16.082 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:16 compute-0 nova_compute[186329]: 2025-12-05 06:23:16.152 186333 DEBUG nova.compute.manager [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Dec 05 06:23:16 compute-0 nova_compute[186329]: 2025-12-05 06:23:16.759 186333 DEBUG nova.network.neutron [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Successfully updated port: 3f0185ae-482f-4962-94bc-a3aea2d2ea77 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Dec 05 06:23:16 compute-0 nova_compute[186329]: 2025-12-05 06:23:16.807 186333 DEBUG nova.compute.manager [req-baff69b2-34eb-4c22-a9a3-a29741c1fa4c req-c3b55ea1-d455-4d5c-a7f7-5c8c162bba76 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Received event network-changed-3f0185ae-482f-4962-94bc-a3aea2d2ea77 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:23:16 compute-0 nova_compute[186329]: 2025-12-05 06:23:16.807 186333 DEBUG nova.compute.manager [req-baff69b2-34eb-4c22-a9a3-a29741c1fa4c req-c3b55ea1-d455-4d5c-a7f7-5c8c162bba76 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Refreshing instance network info cache due to event network-changed-3f0185ae-482f-4962-94bc-a3aea2d2ea77. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 05 06:23:16 compute-0 nova_compute[186329]: 2025-12-05 06:23:16.808 186333 DEBUG oslo_concurrency.lockutils [req-baff69b2-34eb-4c22-a9a3-a29741c1fa4c req-c3b55ea1-d455-4d5c-a7f7-5c8c162bba76 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "refresh_cache-8ed77332-76f2-4438-9d44-961742779ff9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:23:16 compute-0 nova_compute[186329]: 2025-12-05 06:23:16.808 186333 DEBUG oslo_concurrency.lockutils [req-baff69b2-34eb-4c22-a9a3-a29741c1fa4c req-c3b55ea1-d455-4d5c-a7f7-5c8c162bba76 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquired lock "refresh_cache-8ed77332-76f2-4438-9d44-961742779ff9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:23:16 compute-0 nova_compute[186329]: 2025-12-05 06:23:16.808 186333 DEBUG nova.network.neutron [req-baff69b2-34eb-4c22-a9a3-a29741c1fa4c req-c3b55ea1-d455-4d5c-a7f7-5c8c162bba76 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Refreshing network info cache for port 3f0185ae-482f-4962-94bc-a3aea2d2ea77 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 05 06:23:17 compute-0 nova_compute[186329]: 2025-12-05 06:23:17.164 186333 DEBUG nova.compute.manager [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Dec 05 06:23:17 compute-0 nova_compute[186329]: 2025-12-05 06:23:17.165 186333 DEBUG nova.virt.libvirt.driver [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Dec 05 06:23:17 compute-0 nova_compute[186329]: 2025-12-05 06:23:17.165 186333 INFO nova.virt.libvirt.driver [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Creating image(s)
Dec 05 06:23:17 compute-0 nova_compute[186329]: 2025-12-05 06:23:17.166 186333 DEBUG oslo_concurrency.lockutils [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Acquiring lock "/var/lib/nova/instances/8ed77332-76f2-4438-9d44-961742779ff9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:23:17 compute-0 nova_compute[186329]: 2025-12-05 06:23:17.166 186333 DEBUG oslo_concurrency.lockutils [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lock "/var/lib/nova/instances/8ed77332-76f2-4438-9d44-961742779ff9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:23:17 compute-0 nova_compute[186329]: 2025-12-05 06:23:17.167 186333 DEBUG oslo_concurrency.lockutils [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lock "/var/lib/nova/instances/8ed77332-76f2-4438-9d44-961742779ff9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:23:17 compute-0 nova_compute[186329]: 2025-12-05 06:23:17.167 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:23:17 compute-0 nova_compute[186329]: 2025-12-05 06:23:17.169 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:23:17 compute-0 nova_compute[186329]: 2025-12-05 06:23:17.170 186333 DEBUG oslo_concurrency.processutils [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:23:17 compute-0 nova_compute[186329]: 2025-12-05 06:23:17.213 186333 DEBUG oslo_concurrency.processutils [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:23:17 compute-0 nova_compute[186329]: 2025-12-05 06:23:17.214 186333 DEBUG oslo_concurrency.lockutils [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Acquiring lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:23:17 compute-0 nova_compute[186329]: 2025-12-05 06:23:17.214 186333 DEBUG oslo_concurrency.lockutils [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:23:17 compute-0 nova_compute[186329]: 2025-12-05 06:23:17.215 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:23:17 compute-0 nova_compute[186329]: 2025-12-05 06:23:17.217 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:23:17 compute-0 nova_compute[186329]: 2025-12-05 06:23:17.218 186333 DEBUG oslo_concurrency.processutils [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:23:17 compute-0 nova_compute[186329]: 2025-12-05 06:23:17.262 186333 DEBUG oslo_concurrency.processutils [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:23:17 compute-0 nova_compute[186329]: 2025-12-05 06:23:17.263 186333 DEBUG oslo_concurrency.processutils [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b,backing_fmt=raw /var/lib/nova/instances/8ed77332-76f2-4438-9d44-961742779ff9/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:23:17 compute-0 nova_compute[186329]: 2025-12-05 06:23:17.269 186333 DEBUG oslo_concurrency.lockutils [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Acquiring lock "refresh_cache-8ed77332-76f2-4438-9d44-961742779ff9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:23:17 compute-0 nova_compute[186329]: 2025-12-05 06:23:17.284 186333 DEBUG oslo_concurrency.processutils [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b,backing_fmt=raw /var/lib/nova/instances/8ed77332-76f2-4438-9d44-961742779ff9/disk 1073741824" returned: 0 in 0.022s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:23:17 compute-0 nova_compute[186329]: 2025-12-05 06:23:17.285 186333 DEBUG oslo_concurrency.lockutils [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.070s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:23:17 compute-0 nova_compute[186329]: 2025-12-05 06:23:17.285 186333 DEBUG oslo_concurrency.processutils [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:23:17 compute-0 nova_compute[186329]: 2025-12-05 06:23:17.312 186333 WARNING neutronclient.v2_0.client [req-baff69b2-34eb-4c22-a9a3-a29741c1fa4c req-c3b55ea1-d455-4d5c-a7f7-5c8c162bba76 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:23:17 compute-0 nova_compute[186329]: 2025-12-05 06:23:17.327 186333 DEBUG oslo_concurrency.processutils [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:23:17 compute-0 nova_compute[186329]: 2025-12-05 06:23:17.327 186333 DEBUG nova.virt.disk.api [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Checking if we can resize image /var/lib/nova/instances/8ed77332-76f2-4438-9d44-961742779ff9/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 05 06:23:17 compute-0 nova_compute[186329]: 2025-12-05 06:23:17.328 186333 DEBUG oslo_concurrency.processutils [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8ed77332-76f2-4438-9d44-961742779ff9/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:23:17 compute-0 nova_compute[186329]: 2025-12-05 06:23:17.370 186333 DEBUG oslo_concurrency.processutils [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8ed77332-76f2-4438-9d44-961742779ff9/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:23:17 compute-0 nova_compute[186329]: 2025-12-05 06:23:17.371 186333 DEBUG nova.virt.disk.api [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Cannot resize image /var/lib/nova/instances/8ed77332-76f2-4438-9d44-961742779ff9/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 05 06:23:17 compute-0 nova_compute[186329]: 2025-12-05 06:23:17.371 186333 DEBUG nova.virt.libvirt.driver [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Dec 05 06:23:17 compute-0 nova_compute[186329]: 2025-12-05 06:23:17.371 186333 DEBUG nova.virt.libvirt.driver [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Ensure instance console log exists: /var/lib/nova/instances/8ed77332-76f2-4438-9d44-961742779ff9/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 05 06:23:17 compute-0 nova_compute[186329]: 2025-12-05 06:23:17.372 186333 DEBUG oslo_concurrency.lockutils [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:23:17 compute-0 nova_compute[186329]: 2025-12-05 06:23:17.372 186333 DEBUG oslo_concurrency.lockutils [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:23:17 compute-0 nova_compute[186329]: 2025-12-05 06:23:17.372 186333 DEBUG oslo_concurrency.lockutils [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:23:17 compute-0 nova_compute[186329]: 2025-12-05 06:23:17.534 186333 DEBUG nova.network.neutron [req-baff69b2-34eb-4c22-a9a3-a29741c1fa4c req-c3b55ea1-d455-4d5c-a7f7-5c8c162bba76 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 05 06:23:17 compute-0 nova_compute[186329]: 2025-12-05 06:23:17.628 186333 DEBUG nova.network.neutron [req-baff69b2-34eb-4c22-a9a3-a29741c1fa4c req-c3b55ea1-d455-4d5c-a7f7-5c8c162bba76 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:23:18 compute-0 nova_compute[186329]: 2025-12-05 06:23:18.132 186333 DEBUG oslo_concurrency.lockutils [req-baff69b2-34eb-4c22-a9a3-a29741c1fa4c req-c3b55ea1-d455-4d5c-a7f7-5c8c162bba76 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Releasing lock "refresh_cache-8ed77332-76f2-4438-9d44-961742779ff9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:23:18 compute-0 nova_compute[186329]: 2025-12-05 06:23:18.133 186333 DEBUG oslo_concurrency.lockutils [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Acquired lock "refresh_cache-8ed77332-76f2-4438-9d44-961742779ff9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:23:18 compute-0 nova_compute[186329]: 2025-12-05 06:23:18.133 186333 DEBUG nova.network.neutron [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 05 06:23:18 compute-0 nova_compute[186329]: 2025-12-05 06:23:18.818 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:19 compute-0 nova_compute[186329]: 2025-12-05 06:23:19.535 186333 DEBUG nova.network.neutron [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 05 06:23:20 compute-0 nova_compute[186329]: 2025-12-05 06:23:20.525 186333 WARNING neutronclient.v2_0.client [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:23:20 compute-0 nova_compute[186329]: 2025-12-05 06:23:20.681 186333 DEBUG nova.network.neutron [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Updating instance_info_cache with network_info: [{"id": "3f0185ae-482f-4962-94bc-a3aea2d2ea77", "address": "fa:16:3e:12:a1:86", "network": {"id": "8c6607ab-315b-4ce0-bb4d-e22d0d588c81", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1945444158-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa7b2a65c9a54b598b902ce6fa21d41e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f0185ae-48", "ovs_interfaceid": "3f0185ae-482f-4962-94bc-a3aea2d2ea77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.083 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.186 186333 DEBUG oslo_concurrency.lockutils [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Releasing lock "refresh_cache-8ed77332-76f2-4438-9d44-961742779ff9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.186 186333 DEBUG nova.compute.manager [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Instance network_info: |[{"id": "3f0185ae-482f-4962-94bc-a3aea2d2ea77", "address": "fa:16:3e:12:a1:86", "network": {"id": "8c6607ab-315b-4ce0-bb4d-e22d0d588c81", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1945444158-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa7b2a65c9a54b598b902ce6fa21d41e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f0185ae-48", "ovs_interfaceid": "3f0185ae-482f-4962-94bc-a3aea2d2ea77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.188 186333 DEBUG nova.virt.libvirt.driver [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Start _get_guest_xml network_info=[{"id": "3f0185ae-482f-4962-94bc-a3aea2d2ea77", "address": "fa:16:3e:12:a1:86", "network": {"id": "8c6607ab-315b-4ce0-bb4d-e22d0d588c81", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1945444158-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa7b2a65c9a54b598b902ce6fa21d41e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f0185ae-48", "ovs_interfaceid": "3f0185ae-482f-4962-94bc-a3aea2d2ea77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T06:07:47Z,direct_url=<?>,disk_format='qcow2',id=6903ca06-7f44-4ad2-ab8b-0d16feef7d51,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fcef582be2274b9ba43451b49b4066ec',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T06:07:49Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'device_name': '/dev/vda', 'guest_format': None, 'image_id': '6903ca06-7f44-4ad2-ab8b-0d16feef7d51'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.190 186333 WARNING nova.virt.libvirt.driver [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.191 186333 DEBUG nova.virt.driver [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='6903ca06-7f44-4ad2-ab8b-0d16feef7d51', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-496915993', uuid='8ed77332-76f2-4438-9d44-961742779ff9'), owner=OwnerMeta(userid='1b1d9849dd3f4328991385825f24dc8f', username='tempest-TestExecuteNodeResourceConsolidationStrategy-409405411-project-admin', projectid='bc5d63a38e00424aa78cb06b6b41bc09', projectname='tempest-TestExecuteNodeResourceConsolidationStrategy-409405411'), image=ImageMeta(id='6903ca06-7f44-4ad2-ab8b-0d16feef7d51', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='cb13e320-971c-46c2-a935-d695f3631bf8', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "3f0185ae-482f-4962-94bc-a3aea2d2ea77", "address": "fa:16:3e:12:a1:86", "network": {"id": "8c6607ab-315b-4ce0-bb4d-e22d0d588c81", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1945444158-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa7b2a65c9a54b598b902ce6fa21d41e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": 
"tap3f0185ae-48", "ovs_interfaceid": "3f0185ae-482f-4962-94bc-a3aea2d2ea77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764915801.1918209) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.195 186333 DEBUG nova.virt.libvirt.host [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.195 186333 DEBUG nova.virt.libvirt.host [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.198 186333 DEBUG nova.virt.libvirt.host [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.199 186333 DEBUG nova.virt.libvirt.host [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.199 186333 DEBUG nova.virt.libvirt.driver [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.200 186333 DEBUG nova.virt.hardware [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T06:07:46Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cb13e320-971c-46c2-a935-d695f3631bf8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T06:07:47Z,direct_url=<?>,disk_format='qcow2',id=6903ca06-7f44-4ad2-ab8b-0d16feef7d51,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fcef582be2274b9ba43451b49b4066ec',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T06:07:49Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.200 186333 DEBUG nova.virt.hardware [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.200 186333 DEBUG nova.virt.hardware [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.200 186333 DEBUG nova.virt.hardware [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.200 186333 DEBUG nova.virt.hardware [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.201 186333 DEBUG nova.virt.hardware [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.201 186333 DEBUG nova.virt.hardware [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.201 186333 DEBUG nova.virt.hardware [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.201 186333 DEBUG nova.virt.hardware [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.201 186333 DEBUG nova.virt.hardware [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.202 186333 DEBUG nova.virt.hardware [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.204 186333 DEBUG nova.virt.libvirt.vif [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-05T06:23:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-496915993',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-496',id=15,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bc5d63a38e00424aa78cb06b6b41bc09',ramdisk_id='',reservation_id='r-973d39j5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-409405411',owner
_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-409405411-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T06:23:16Z,user_data=None,user_id='1b1d9849dd3f4328991385825f24dc8f',uuid=8ed77332-76f2-4438-9d44-961742779ff9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3f0185ae-482f-4962-94bc-a3aea2d2ea77", "address": "fa:16:3e:12:a1:86", "network": {"id": "8c6607ab-315b-4ce0-bb4d-e22d0d588c81", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1945444158-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa7b2a65c9a54b598b902ce6fa21d41e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f0185ae-48", "ovs_interfaceid": "3f0185ae-482f-4962-94bc-a3aea2d2ea77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.205 186333 DEBUG nova.network.os_vif_util [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Converting VIF {"id": "3f0185ae-482f-4962-94bc-a3aea2d2ea77", "address": "fa:16:3e:12:a1:86", "network": {"id": "8c6607ab-315b-4ce0-bb4d-e22d0d588c81", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1945444158-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa7b2a65c9a54b598b902ce6fa21d41e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f0185ae-48", "ovs_interfaceid": "3f0185ae-482f-4962-94bc-a3aea2d2ea77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.205 186333 DEBUG nova.network.os_vif_util [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:12:a1:86,bridge_name='br-int',has_traffic_filtering=True,id=3f0185ae-482f-4962-94bc-a3aea2d2ea77,network=Network(8c6607ab-315b-4ce0-bb4d-e22d0d588c81),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3f0185ae-48') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.206 186333 DEBUG nova.objects.instance [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lazy-loading 'pci_devices' on Instance uuid 8ed77332-76f2-4438-9d44-961742779ff9 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.711 186333 DEBUG nova.virt.libvirt.driver [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] End _get_guest_xml xml=<domain type="kvm">
Dec 05 06:23:21 compute-0 nova_compute[186329]:   <uuid>8ed77332-76f2-4438-9d44-961742779ff9</uuid>
Dec 05 06:23:21 compute-0 nova_compute[186329]:   <name>instance-0000000f</name>
Dec 05 06:23:21 compute-0 nova_compute[186329]:   <memory>131072</memory>
Dec 05 06:23:21 compute-0 nova_compute[186329]:   <vcpu>1</vcpu>
Dec 05 06:23:21 compute-0 nova_compute[186329]:   <metadata>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 06:23:21 compute-0 nova_compute[186329]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:       <nova:name>tempest-TestExecuteNodeResourceConsolidationStrategy-server-496915993</nova:name>
Dec 05 06:23:21 compute-0 nova_compute[186329]:       <nova:creationTime>2025-12-05 06:23:21</nova:creationTime>
Dec 05 06:23:21 compute-0 nova_compute[186329]:       <nova:flavor name="m1.nano" id="cb13e320-971c-46c2-a935-d695f3631bf8">
Dec 05 06:23:21 compute-0 nova_compute[186329]:         <nova:memory>128</nova:memory>
Dec 05 06:23:21 compute-0 nova_compute[186329]:         <nova:disk>1</nova:disk>
Dec 05 06:23:21 compute-0 nova_compute[186329]:         <nova:swap>0</nova:swap>
Dec 05 06:23:21 compute-0 nova_compute[186329]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 06:23:21 compute-0 nova_compute[186329]:         <nova:vcpus>1</nova:vcpus>
Dec 05 06:23:21 compute-0 nova_compute[186329]:         <nova:extraSpecs>
Dec 05 06:23:21 compute-0 nova_compute[186329]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 05 06:23:21 compute-0 nova_compute[186329]:         </nova:extraSpecs>
Dec 05 06:23:21 compute-0 nova_compute[186329]:       </nova:flavor>
Dec 05 06:23:21 compute-0 nova_compute[186329]:       <nova:image uuid="6903ca06-7f44-4ad2-ab8b-0d16feef7d51">
Dec 05 06:23:21 compute-0 nova_compute[186329]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 05 06:23:21 compute-0 nova_compute[186329]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 05 06:23:21 compute-0 nova_compute[186329]:         <nova:minDisk>1</nova:minDisk>
Dec 05 06:23:21 compute-0 nova_compute[186329]:         <nova:minRam>0</nova:minRam>
Dec 05 06:23:21 compute-0 nova_compute[186329]:         <nova:properties>
Dec 05 06:23:21 compute-0 nova_compute[186329]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 05 06:23:21 compute-0 nova_compute[186329]:         </nova:properties>
Dec 05 06:23:21 compute-0 nova_compute[186329]:       </nova:image>
Dec 05 06:23:21 compute-0 nova_compute[186329]:       <nova:owner>
Dec 05 06:23:21 compute-0 nova_compute[186329]:         <nova:user uuid="1b1d9849dd3f4328991385825f24dc8f">tempest-TestExecuteNodeResourceConsolidationStrategy-409405411-project-admin</nova:user>
Dec 05 06:23:21 compute-0 nova_compute[186329]:         <nova:project uuid="bc5d63a38e00424aa78cb06b6b41bc09">tempest-TestExecuteNodeResourceConsolidationStrategy-409405411</nova:project>
Dec 05 06:23:21 compute-0 nova_compute[186329]:       </nova:owner>
Dec 05 06:23:21 compute-0 nova_compute[186329]:       <nova:root type="image" uuid="6903ca06-7f44-4ad2-ab8b-0d16feef7d51"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:       <nova:ports>
Dec 05 06:23:21 compute-0 nova_compute[186329]:         <nova:port uuid="3f0185ae-482f-4962-94bc-a3aea2d2ea77">
Dec 05 06:23:21 compute-0 nova_compute[186329]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:         </nova:port>
Dec 05 06:23:21 compute-0 nova_compute[186329]:       </nova:ports>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     </nova:instance>
Dec 05 06:23:21 compute-0 nova_compute[186329]:   </metadata>
Dec 05 06:23:21 compute-0 nova_compute[186329]:   <sysinfo type="smbios">
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <system>
Dec 05 06:23:21 compute-0 nova_compute[186329]:       <entry name="manufacturer">RDO</entry>
Dec 05 06:23:21 compute-0 nova_compute[186329]:       <entry name="product">OpenStack Compute</entry>
Dec 05 06:23:21 compute-0 nova_compute[186329]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 05 06:23:21 compute-0 nova_compute[186329]:       <entry name="serial">8ed77332-76f2-4438-9d44-961742779ff9</entry>
Dec 05 06:23:21 compute-0 nova_compute[186329]:       <entry name="uuid">8ed77332-76f2-4438-9d44-961742779ff9</entry>
Dec 05 06:23:21 compute-0 nova_compute[186329]:       <entry name="family">Virtual Machine</entry>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     </system>
Dec 05 06:23:21 compute-0 nova_compute[186329]:   </sysinfo>
Dec 05 06:23:21 compute-0 nova_compute[186329]:   <os>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <boot dev="hd"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <smbios mode="sysinfo"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:   </os>
Dec 05 06:23:21 compute-0 nova_compute[186329]:   <features>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <acpi/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <apic/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <vmcoreinfo/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:   </features>
Dec 05 06:23:21 compute-0 nova_compute[186329]:   <clock offset="utc">
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <timer name="hpet" present="no"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:   </clock>
Dec 05 06:23:21 compute-0 nova_compute[186329]:   <cpu mode="custom" match="exact">
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <model>Nehalem</model>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:   </cpu>
Dec 05 06:23:21 compute-0 nova_compute[186329]:   <devices>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <disk type="file" device="disk">
Dec 05 06:23:21 compute-0 nova_compute[186329]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:       <source file="/var/lib/nova/instances/8ed77332-76f2-4438-9d44-961742779ff9/disk"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:       <target dev="vda" bus="virtio"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     </disk>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <disk type="file" device="cdrom">
Dec 05 06:23:21 compute-0 nova_compute[186329]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:       <source file="/var/lib/nova/instances/8ed77332-76f2-4438-9d44-961742779ff9/disk.config"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:       <target dev="sda" bus="sata"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     </disk>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <interface type="ethernet">
Dec 05 06:23:21 compute-0 nova_compute[186329]:       <mac address="fa:16:3e:12:a1:86"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:       <model type="virtio"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:       <mtu size="1442"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:       <target dev="tap3f0185ae-48"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     </interface>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <serial type="pty">
Dec 05 06:23:21 compute-0 nova_compute[186329]:       <log file="/var/lib/nova/instances/8ed77332-76f2-4438-9d44-961742779ff9/console.log" append="off"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     </serial>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <video>
Dec 05 06:23:21 compute-0 nova_compute[186329]:       <model type="virtio"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     </video>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <input type="tablet" bus="usb"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <rng model="virtio">
Dec 05 06:23:21 compute-0 nova_compute[186329]:       <backend model="random">/dev/urandom</backend>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     </rng>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <controller type="usb" index="0"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 05 06:23:21 compute-0 nova_compute[186329]:       <stats period="10"/>
Dec 05 06:23:21 compute-0 nova_compute[186329]:     </memballoon>
Dec 05 06:23:21 compute-0 nova_compute[186329]:   </devices>
Dec 05 06:23:21 compute-0 nova_compute[186329]: </domain>
Dec 05 06:23:21 compute-0 nova_compute[186329]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.712 186333 DEBUG nova.compute.manager [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Preparing to wait for external event network-vif-plugged-3f0185ae-482f-4962-94bc-a3aea2d2ea77 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.712 186333 DEBUG oslo_concurrency.lockutils [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Acquiring lock "8ed77332-76f2-4438-9d44-961742779ff9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.712 186333 DEBUG oslo_concurrency.lockutils [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lock "8ed77332-76f2-4438-9d44-961742779ff9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.713 186333 DEBUG oslo_concurrency.lockutils [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lock "8ed77332-76f2-4438-9d44-961742779ff9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.713 186333 DEBUG nova.virt.libvirt.vif [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-05T06:23:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-496915993',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-496',id=15,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bc5d63a38e00424aa78cb06b6b41bc09',ramdisk_id='',reservation_id='r-973d39j5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-409405
411',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-409405411-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T06:23:16Z,user_data=None,user_id='1b1d9849dd3f4328991385825f24dc8f',uuid=8ed77332-76f2-4438-9d44-961742779ff9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3f0185ae-482f-4962-94bc-a3aea2d2ea77", "address": "fa:16:3e:12:a1:86", "network": {"id": "8c6607ab-315b-4ce0-bb4d-e22d0d588c81", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1945444158-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa7b2a65c9a54b598b902ce6fa21d41e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f0185ae-48", "ovs_interfaceid": "3f0185ae-482f-4962-94bc-a3aea2d2ea77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.713 186333 DEBUG nova.network.os_vif_util [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Converting VIF {"id": "3f0185ae-482f-4962-94bc-a3aea2d2ea77", "address": "fa:16:3e:12:a1:86", "network": {"id": "8c6607ab-315b-4ce0-bb4d-e22d0d588c81", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1945444158-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa7b2a65c9a54b598b902ce6fa21d41e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f0185ae-48", "ovs_interfaceid": "3f0185ae-482f-4962-94bc-a3aea2d2ea77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.714 186333 DEBUG nova.network.os_vif_util [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:12:a1:86,bridge_name='br-int',has_traffic_filtering=True,id=3f0185ae-482f-4962-94bc-a3aea2d2ea77,network=Network(8c6607ab-315b-4ce0-bb4d-e22d0d588c81),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3f0185ae-48') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.714 186333 DEBUG os_vif [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:12:a1:86,bridge_name='br-int',has_traffic_filtering=True,id=3f0185ae-482f-4962-94bc-a3aea2d2ea77,network=Network(8c6607ab-315b-4ce0-bb4d-e22d0d588c81),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3f0185ae-48') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.714 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.715 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.715 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.715 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.716 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '17805363-d9f7-5784-86be-6994aed68fd0', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.717 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.717 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.720 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.720 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3f0185ae-48, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.720 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap3f0185ae-48, col_values=(('qos', UUID('6685d8f5-fe2d-46e0-9dc8-71a6c0669a36')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.720 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap3f0185ae-48, col_values=(('external_ids', {'iface-id': '3f0185ae-482f-4962-94bc-a3aea2d2ea77', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:12:a1:86', 'vm-uuid': '8ed77332-76f2-4438-9d44-961742779ff9'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.721 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:21 compute-0 NetworkManager[55434]: <info>  [1764915801.7222] manager: (tap3f0185ae-48): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/58)
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.723 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.725 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:21 compute-0 nova_compute[186329]: 2025-12-05 06:23:21.725 186333 INFO os_vif [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:12:a1:86,bridge_name='br-int',has_traffic_filtering=True,id=3f0185ae-482f-4962-94bc-a3aea2d2ea77,network=Network(8c6607ab-315b-4ce0-bb4d-e22d0d588c81),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3f0185ae-48')
Dec 05 06:23:23 compute-0 nova_compute[186329]: 2025-12-05 06:23:23.250 186333 DEBUG nova.virt.libvirt.driver [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 05 06:23:23 compute-0 nova_compute[186329]: 2025-12-05 06:23:23.251 186333 DEBUG nova.virt.libvirt.driver [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 05 06:23:23 compute-0 nova_compute[186329]: 2025-12-05 06:23:23.251 186333 DEBUG nova.virt.libvirt.driver [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] No VIF found with MAC fa:16:3e:12:a1:86, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 05 06:23:23 compute-0 nova_compute[186329]: 2025-12-05 06:23:23.251 186333 INFO nova.virt.libvirt.driver [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Using config drive
Dec 05 06:23:23 compute-0 nova_compute[186329]: 2025-12-05 06:23:23.759 186333 WARNING neutronclient.v2_0.client [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:23:23 compute-0 nova_compute[186329]: 2025-12-05 06:23:23.819 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:24 compute-0 nova_compute[186329]: 2025-12-05 06:23:24.581 186333 INFO nova.virt.libvirt.driver [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Creating config drive at /var/lib/nova/instances/8ed77332-76f2-4438-9d44-961742779ff9/disk.config
Dec 05 06:23:24 compute-0 nova_compute[186329]: 2025-12-05 06:23:24.586 186333 DEBUG oslo_concurrency.processutils [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8ed77332-76f2-4438-9d44-961742779ff9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp86btc425 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:23:24 compute-0 nova_compute[186329]: 2025-12-05 06:23:24.702 186333 DEBUG oslo_concurrency.processutils [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8ed77332-76f2-4438-9d44-961742779ff9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmp86btc425" returned: 0 in 0.116s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:23:24 compute-0 NetworkManager[55434]: <info>  [1764915804.7531] manager: (tap3f0185ae-48): new Tun device (/org/freedesktop/NetworkManager/Devices/59)
Dec 05 06:23:24 compute-0 kernel: tap3f0185ae-48: entered promiscuous mode
Dec 05 06:23:24 compute-0 ovn_controller[95223]: 2025-12-05T06:23:24Z|00129|binding|INFO|Claiming lport 3f0185ae-482f-4962-94bc-a3aea2d2ea77 for this chassis.
Dec 05 06:23:24 compute-0 ovn_controller[95223]: 2025-12-05T06:23:24Z|00130|binding|INFO|3f0185ae-482f-4962-94bc-a3aea2d2ea77: Claiming fa:16:3e:12:a1:86 10.100.0.9
Dec 05 06:23:24 compute-0 nova_compute[186329]: 2025-12-05 06:23:24.757 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:24 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:24.760 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:12:a1:86 10.100.0.9'], port_security=['fa:16:3e:12:a1:86 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '8ed77332-76f2-4438-9d44-961742779ff9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8c6607ab-315b-4ce0-bb4d-e22d0d588c81', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bc5d63a38e00424aa78cb06b6b41bc09', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7b69bd8c-f5f0-4d39-bcf7-4157a4a0e235', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d520596c-b299-4696-9930-0280ad4bb232, chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], logical_port=3f0185ae-482f-4962-94bc-a3aea2d2ea77) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:23:24 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:24.761 104041 INFO neutron.agent.ovn.metadata.agent [-] Port 3f0185ae-482f-4962-94bc-a3aea2d2ea77 in datapath 8c6607ab-315b-4ce0-bb4d-e22d0d588c81 bound to our chassis
Dec 05 06:23:24 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:24.762 104041 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8c6607ab-315b-4ce0-bb4d-e22d0d588c81
Dec 05 06:23:24 compute-0 ovn_controller[95223]: 2025-12-05T06:23:24Z|00131|binding|INFO|Setting lport 3f0185ae-482f-4962-94bc-a3aea2d2ea77 up in Southbound
Dec 05 06:23:24 compute-0 ovn_controller[95223]: 2025-12-05T06:23:24Z|00132|binding|INFO|Setting lport 3f0185ae-482f-4962-94bc-a3aea2d2ea77 ovn-installed in OVS
Dec 05 06:23:24 compute-0 nova_compute[186329]: 2025-12-05 06:23:24.778 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:24 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:24.783 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[234cf4b3-1e9a-42cf-98bb-ee3d1e924a6b]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:24 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:24.785 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8c6607ab-31 in ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 05 06:23:24 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:24.786 206383 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8c6607ab-30 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 05 06:23:24 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:24.786 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[158a6bf4-76ac-439f-b5e2-3b61c0e728cc]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:24 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:24.787 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[d63e07be-f714-4858-a658-704b010f4770]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:24 compute-0 systemd-udevd[211437]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 06:23:24 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:24.797 104153 DEBUG oslo.privsep.daemon [-] privsep: reply[1182ea08-9d57-4e2b-b2f1-5299350defc5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:24 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:24.804 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[59a43879-cf97-433a-9ad4-b154d32ccc43]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:24 compute-0 NetworkManager[55434]: <info>  [1764915804.8074] device (tap3f0185ae-48): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 06:23:24 compute-0 NetworkManager[55434]: <info>  [1764915804.8081] device (tap3f0185ae-48): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 06:23:24 compute-0 systemd-machined[152967]: New machine qemu-11-instance-0000000f.
Dec 05 06:23:24 compute-0 systemd[1]: Started Virtual Machine qemu-11-instance-0000000f.
Dec 05 06:23:24 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:24.829 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[febba568-6e4e-405b-a7e7-77946c55e8f7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:24 compute-0 systemd-udevd[211451]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 06:23:24 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:24.834 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[7eb8eef1-25e0-4f0c-92a4-e06a0bff31b6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:24 compute-0 NetworkManager[55434]: <info>  [1764915804.8345] manager: (tap8c6607ab-30): new Veth device (/org/freedesktop/NetworkManager/Devices/60)
Dec 05 06:23:24 compute-0 podman[211414]: 2025-12-05 06:23:24.835686379 +0000 UTC m=+0.084803103 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 06:23:24 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:24.869 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[0ccd04c1-7dc6-41e0-a788-a71befe01307]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:24 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:24.871 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[05bab1ad-a94f-4919-be91-21cdb12a5fa5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:24 compute-0 NetworkManager[55434]: <info>  [1764915804.8881] device (tap8c6607ab-30): carrier: link connected
Dec 05 06:23:24 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:24.891 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[e0cb81f9-7f36-47c1-b6e2-dca69f80ebce]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:24 compute-0 podman[211413]: 2025-12-05 06:23:24.898508353 +0000 UTC m=+0.149449377 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4)
Dec 05 06:23:24 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:24.906 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[1cef9727-e0d5-4369-ab06-5b0512bcd6f2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8c6607ab-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1d:77:7c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 39], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 368342, 'reachable_time': 17508, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211490, 'error': None, 'target': 'ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:24 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:24.919 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[e23f3644-6df9-4c36-89d6-2a9683582612]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1d:777c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 368342, 'tstamp': 368342}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211491, 'error': None, 'target': 'ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:24 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:24.931 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[ba410dd7-d8a9-4526-ab98-648335a498e9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8c6607ab-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1d:77:7c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 176, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 176, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 39], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 368342, 'reachable_time': 17508, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 211492, 'error': None, 'target': 'ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:24 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:24.952 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[c23d1d59-42d0-444a-b5ce-4c6400ec8d32]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:24 compute-0 nova_compute[186329]: 2025-12-05 06:23:24.990 186333 DEBUG nova.compute.manager [req-9c26d69e-251e-4df6-8d4d-f1f6a512b37f req-085e31b2-0bd4-42b4-9d17-519e6df76b09 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Received event network-vif-plugged-3f0185ae-482f-4962-94bc-a3aea2d2ea77 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:23:24 compute-0 nova_compute[186329]: 2025-12-05 06:23:24.990 186333 DEBUG oslo_concurrency.lockutils [req-9c26d69e-251e-4df6-8d4d-f1f6a512b37f req-085e31b2-0bd4-42b4-9d17-519e6df76b09 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "8ed77332-76f2-4438-9d44-961742779ff9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:23:24 compute-0 nova_compute[186329]: 2025-12-05 06:23:24.991 186333 DEBUG oslo_concurrency.lockutils [req-9c26d69e-251e-4df6-8d4d-f1f6a512b37f req-085e31b2-0bd4-42b4-9d17-519e6df76b09 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "8ed77332-76f2-4438-9d44-961742779ff9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:23:24 compute-0 nova_compute[186329]: 2025-12-05 06:23:24.991 186333 DEBUG oslo_concurrency.lockutils [req-9c26d69e-251e-4df6-8d4d-f1f6a512b37f req-085e31b2-0bd4-42b4-9d17-519e6df76b09 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "8ed77332-76f2-4438-9d44-961742779ff9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:23:24 compute-0 nova_compute[186329]: 2025-12-05 06:23:24.991 186333 DEBUG nova.compute.manager [req-9c26d69e-251e-4df6-8d4d-f1f6a512b37f req-085e31b2-0bd4-42b4-9d17-519e6df76b09 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Processing event network-vif-plugged-3f0185ae-482f-4962-94bc-a3aea2d2ea77 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:24.999 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[86b5b27e-b571-4275-a141-692cd51b7dca]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:25.000 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8c6607ab-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:25.000 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:25.001 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8c6607ab-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:23:25 compute-0 NetworkManager[55434]: <info>  [1764915805.0029] manager: (tap8c6607ab-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/61)
Dec 05 06:23:25 compute-0 kernel: tap8c6607ab-30: entered promiscuous mode
Dec 05 06:23:25 compute-0 nova_compute[186329]: 2025-12-05 06:23:25.002 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:25 compute-0 nova_compute[186329]: 2025-12-05 06:23:25.004 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:25.005 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8c6607ab-30, col_values=(('external_ids', {'iface-id': '23184677-e308-4f95-b4f6-6e02e8b7fc45'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:23:25 compute-0 nova_compute[186329]: 2025-12-05 06:23:25.005 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:25 compute-0 ovn_controller[95223]: 2025-12-05T06:23:25Z|00133|binding|INFO|Releasing lport 23184677-e308-4f95-b4f6-6e02e8b7fc45 from this chassis (sb_readonly=0)
Dec 05 06:23:25 compute-0 nova_compute[186329]: 2025-12-05 06:23:25.017 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:25.018 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[898ff622-e2b8-4789-afb1-550f8c163052]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:25.018 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8c6607ab-315b-4ce0-bb4d-e22d0d588c81.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8c6607ab-315b-4ce0-bb4d-e22d0d588c81.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:25.018 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8c6607ab-315b-4ce0-bb4d-e22d0d588c81.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8c6607ab-315b-4ce0-bb4d-e22d0d588c81.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:25.019 104041 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 8c6607ab-315b-4ce0-bb4d-e22d0d588c81 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:25.019 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8c6607ab-315b-4ce0-bb4d-e22d0d588c81.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8c6607ab-315b-4ce0-bb4d-e22d0d588c81.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:25.019 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[aa5409d7-43b5-41c0-9ebd-01a107bd6f60]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:25.020 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8c6607ab-315b-4ce0-bb4d-e22d0d588c81.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8c6607ab-315b-4ce0-bb4d-e22d0d588c81.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:25.020 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[2e3baabb-27c5-43cd-9caa-247b2fae300a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:25.021 104041 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]: global
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]:     log         /dev/log local0 debug
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]:     log-tag     haproxy-metadata-proxy-8c6607ab-315b-4ce0-bb4d-e22d0d588c81
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]:     user        root
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]:     group       root
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]:     maxconn     1024
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]:     pidfile     /var/lib/neutron/external/pids/8c6607ab-315b-4ce0-bb4d-e22d0d588c81.pid.haproxy
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]:     daemon
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]: defaults
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]:     log global
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]:     mode http
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]:     option httplog
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]:     option dontlognull
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]:     option http-server-close
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]:     option forwardfor
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]:     retries                 3
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]:     timeout http-request    30s
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]:     timeout connect         30s
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]:     timeout client          32s
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]:     timeout server          32s
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]:     timeout http-keep-alive 30s
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]: listen listener
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]:     bind 169.254.169.254:80
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]:     
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]:     http-request add-header X-OVN-Network-ID 8c6607ab-315b-4ce0-bb4d-e22d0d588c81
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 05 06:23:25 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:25.021 104041 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81', 'env', 'PROCESS_TAG=haproxy-8c6607ab-315b-4ce0-bb4d-e22d0d588c81', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8c6607ab-315b-4ce0-bb4d-e22d0d588c81.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 05 06:23:25 compute-0 nova_compute[186329]: 2025-12-05 06:23:25.076 186333 DEBUG nova.compute.manager [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 05 06:23:25 compute-0 nova_compute[186329]: 2025-12-05 06:23:25.080 186333 DEBUG nova.virt.libvirt.driver [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Dec 05 06:23:25 compute-0 nova_compute[186329]: 2025-12-05 06:23:25.082 186333 INFO nova.virt.libvirt.driver [-] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Instance spawned successfully.
Dec 05 06:23:25 compute-0 nova_compute[186329]: 2025-12-05 06:23:25.082 186333 DEBUG nova.virt.libvirt.driver [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Dec 05 06:23:25 compute-0 podman[211527]: 2025-12-05 06:23:25.336252747 +0000 UTC m=+0.035427075 container create 2c2b9ae202f28c327c77802448f86731f2ac0757b43e672c377a02d2bcac0040 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Dec 05 06:23:25 compute-0 systemd[1]: Started libpod-conmon-2c2b9ae202f28c327c77802448f86731f2ac0757b43e672c377a02d2bcac0040.scope.
Dec 05 06:23:25 compute-0 systemd[1]: Started libcrun container.
Dec 05 06:23:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14e78fb306d8605ace27d70f906f3e1a2bbd2856c00f4d414b28da557d342bc4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 06:23:25 compute-0 podman[211527]: 2025-12-05 06:23:25.398162104 +0000 UTC m=+0.097336453 container init 2c2b9ae202f28c327c77802448f86731f2ac0757b43e672c377a02d2bcac0040 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Dec 05 06:23:25 compute-0 podman[211527]: 2025-12-05 06:23:25.402903391 +0000 UTC m=+0.102077720 container start 2c2b9ae202f28c327c77802448f86731f2ac0757b43e672c377a02d2bcac0040 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec)
Dec 05 06:23:25 compute-0 podman[211527]: 2025-12-05 06:23:25.320216529 +0000 UTC m=+0.019390888 image pull 773fbabd60bc1c8e2ad203301a187a593937fa42bcc751aba0971bd96baac0cb quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current
Dec 05 06:23:25 compute-0 neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81[211539]: [NOTICE]   (211543) : New worker (211545) forked
Dec 05 06:23:25 compute-0 neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81[211539]: [NOTICE]   (211543) : Loading success.
Dec 05 06:23:25 compute-0 nova_compute[186329]: 2025-12-05 06:23:25.591 186333 DEBUG nova.virt.libvirt.driver [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:23:25 compute-0 nova_compute[186329]: 2025-12-05 06:23:25.591 186333 DEBUG nova.virt.libvirt.driver [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:23:25 compute-0 nova_compute[186329]: 2025-12-05 06:23:25.592 186333 DEBUG nova.virt.libvirt.driver [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:23:25 compute-0 nova_compute[186329]: 2025-12-05 06:23:25.592 186333 DEBUG nova.virt.libvirt.driver [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:23:25 compute-0 nova_compute[186329]: 2025-12-05 06:23:25.592 186333 DEBUG nova.virt.libvirt.driver [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:23:25 compute-0 nova_compute[186329]: 2025-12-05 06:23:25.593 186333 DEBUG nova.virt.libvirt.driver [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:23:26 compute-0 nova_compute[186329]: 2025-12-05 06:23:26.099 186333 INFO nova.compute.manager [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Took 8.93 seconds to spawn the instance on the hypervisor.
Dec 05 06:23:26 compute-0 nova_compute[186329]: 2025-12-05 06:23:26.099 186333 DEBUG nova.compute.manager [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 05 06:23:26 compute-0 nova_compute[186329]: 2025-12-05 06:23:26.619 186333 INFO nova.compute.manager [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Took 14.08 seconds to build instance.
Dec 05 06:23:26 compute-0 nova_compute[186329]: 2025-12-05 06:23:26.723 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:27 compute-0 nova_compute[186329]: 2025-12-05 06:23:27.035 186333 DEBUG nova.compute.manager [req-2088eb92-c182-49f5-8cd7-735b3fe6ee04 req-74d200cb-8ae3-4ca2-b28e-e1a9749a552f fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Received event network-vif-plugged-3f0185ae-482f-4962-94bc-a3aea2d2ea77 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:23:27 compute-0 nova_compute[186329]: 2025-12-05 06:23:27.035 186333 DEBUG oslo_concurrency.lockutils [req-2088eb92-c182-49f5-8cd7-735b3fe6ee04 req-74d200cb-8ae3-4ca2-b28e-e1a9749a552f fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "8ed77332-76f2-4438-9d44-961742779ff9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:23:27 compute-0 nova_compute[186329]: 2025-12-05 06:23:27.035 186333 DEBUG oslo_concurrency.lockutils [req-2088eb92-c182-49f5-8cd7-735b3fe6ee04 req-74d200cb-8ae3-4ca2-b28e-e1a9749a552f fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "8ed77332-76f2-4438-9d44-961742779ff9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:23:27 compute-0 nova_compute[186329]: 2025-12-05 06:23:27.035 186333 DEBUG oslo_concurrency.lockutils [req-2088eb92-c182-49f5-8cd7-735b3fe6ee04 req-74d200cb-8ae3-4ca2-b28e-e1a9749a552f fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "8ed77332-76f2-4438-9d44-961742779ff9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:23:27 compute-0 nova_compute[186329]: 2025-12-05 06:23:27.035 186333 DEBUG nova.compute.manager [req-2088eb92-c182-49f5-8cd7-735b3fe6ee04 req-74d200cb-8ae3-4ca2-b28e-e1a9749a552f fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] No waiting events found dispatching network-vif-plugged-3f0185ae-482f-4962-94bc-a3aea2d2ea77 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:23:27 compute-0 nova_compute[186329]: 2025-12-05 06:23:27.036 186333 WARNING nova.compute.manager [req-2088eb92-c182-49f5-8cd7-735b3fe6ee04 req-74d200cb-8ae3-4ca2-b28e-e1a9749a552f fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Received unexpected event network-vif-plugged-3f0185ae-482f-4962-94bc-a3aea2d2ea77 for instance with vm_state active and task_state None.
Dec 05 06:23:27 compute-0 nova_compute[186329]: 2025-12-05 06:23:27.123 186333 DEBUG oslo_concurrency.lockutils [None req-94855f1b-3abd-410f-9b1f-8873717fa68d 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lock "8ed77332-76f2-4438-9d44-961742779ff9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.590s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:23:28 compute-0 nova_compute[186329]: 2025-12-05 06:23:28.821 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:29.507 104041 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:23:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:29.507 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:23:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:29.508 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:23:29 compute-0 podman[196599]: time="2025-12-05T06:23:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:23:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:23:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18590 "" "Go-http-client/1.1"
Dec 05 06:23:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:23:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3043 "" "Go-http-client/1.1"
Dec 05 06:23:31 compute-0 openstack_network_exporter[198686]: ERROR   06:23:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:23:31 compute-0 openstack_network_exporter[198686]: ERROR   06:23:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:23:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:23:31 compute-0 openstack_network_exporter[198686]: ERROR   06:23:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:23:31 compute-0 openstack_network_exporter[198686]: ERROR   06:23:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:23:31 compute-0 openstack_network_exporter[198686]: ERROR   06:23:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:23:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:23:31 compute-0 nova_compute[186329]: 2025-12-05 06:23:31.725 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:33 compute-0 nova_compute[186329]: 2025-12-05 06:23:33.823 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:35 compute-0 podman[211558]: 2025-12-05 06:23:35.480374479 +0000 UTC m=+0.061339477 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.4)
Dec 05 06:23:35 compute-0 podman[211556]: 2025-12-05 06:23:35.494377642 +0000 UTC m=+0.078474069 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_id=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Dec 05 06:23:35 compute-0 podman[211557]: 2025-12-05 06:23:35.497390067 +0000 UTC m=+0.079495870 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 05 06:23:35 compute-0 ovn_controller[95223]: 2025-12-05T06:23:35Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:12:a1:86 10.100.0.9
Dec 05 06:23:35 compute-0 ovn_controller[95223]: 2025-12-05T06:23:35Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:12:a1:86 10.100.0.9
Dec 05 06:23:36 compute-0 nova_compute[186329]: 2025-12-05 06:23:36.729 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:38 compute-0 nova_compute[186329]: 2025-12-05 06:23:38.781 186333 DEBUG nova.virt.libvirt.driver [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 89d73880-ffbb-49c5-9e2d-a49a64c44523] Creating tmpfile /var/lib/nova/instances/tmpfdb2kzba to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Dec 05 06:23:38 compute-0 nova_compute[186329]: 2025-12-05 06:23:38.782 186333 WARNING neutronclient.v2_0.client [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:23:38 compute-0 nova_compute[186329]: 2025-12-05 06:23:38.789 186333 DEBUG nova.compute.manager [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfdb2kzba',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Dec 05 06:23:38 compute-0 nova_compute[186329]: 2025-12-05 06:23:38.824 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:40 compute-0 nova_compute[186329]: 2025-12-05 06:23:40.821 186333 WARNING neutronclient.v2_0.client [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:23:41 compute-0 nova_compute[186329]: 2025-12-05 06:23:41.732 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:43 compute-0 nova_compute[186329]: 2025-12-05 06:23:43.828 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:44 compute-0 nova_compute[186329]: 2025-12-05 06:23:44.756 186333 DEBUG nova.compute.manager [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfdb2kzba',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='89d73880-ffbb-49c5-9e2d-a49a64c44523',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Dec 05 06:23:45 compute-0 nova_compute[186329]: 2025-12-05 06:23:45.766 186333 DEBUG oslo_concurrency.lockutils [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "refresh_cache-89d73880-ffbb-49c5-9e2d-a49a64c44523" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:23:45 compute-0 nova_compute[186329]: 2025-12-05 06:23:45.766 186333 DEBUG oslo_concurrency.lockutils [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquired lock "refresh_cache-89d73880-ffbb-49c5-9e2d-a49a64c44523" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:23:45 compute-0 nova_compute[186329]: 2025-12-05 06:23:45.766 186333 DEBUG nova.network.neutron [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 89d73880-ffbb-49c5-9e2d-a49a64c44523] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 05 06:23:46 compute-0 nova_compute[186329]: 2025-12-05 06:23:46.271 186333 WARNING neutronclient.v2_0.client [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:23:46 compute-0 nova_compute[186329]: 2025-12-05 06:23:46.734 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:46 compute-0 nova_compute[186329]: 2025-12-05 06:23:46.773 186333 WARNING neutronclient.v2_0.client [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:23:46 compute-0 nova_compute[186329]: 2025-12-05 06:23:46.880 186333 DEBUG nova.network.neutron [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 89d73880-ffbb-49c5-9e2d-a49a64c44523] Updating instance_info_cache with network_info: [{"id": "66b62d9b-f25d-401b-b95c-0c8b3982f733", "address": "fa:16:3e:f6:fd:2f", "network": {"id": "8c6607ab-315b-4ce0-bb4d-e22d0d588c81", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1945444158-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa7b2a65c9a54b598b902ce6fa21d41e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66b62d9b-f2", "ovs_interfaceid": "66b62d9b-f25d-401b-b95c-0c8b3982f733", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:23:47 compute-0 nova_compute[186329]: 2025-12-05 06:23:47.383 186333 DEBUG oslo_concurrency.lockutils [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Releasing lock "refresh_cache-89d73880-ffbb-49c5-9e2d-a49a64c44523" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:23:47 compute-0 nova_compute[186329]: 2025-12-05 06:23:47.392 186333 DEBUG nova.virt.libvirt.driver [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 89d73880-ffbb-49c5-9e2d-a49a64c44523] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfdb2kzba',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='89d73880-ffbb-49c5-9e2d-a49a64c44523',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Dec 05 06:23:47 compute-0 nova_compute[186329]: 2025-12-05 06:23:47.392 186333 DEBUG nova.virt.libvirt.driver [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 89d73880-ffbb-49c5-9e2d-a49a64c44523] Creating instance directory: /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Dec 05 06:23:47 compute-0 nova_compute[186329]: 2025-12-05 06:23:47.393 186333 DEBUG nova.virt.libvirt.driver [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 89d73880-ffbb-49c5-9e2d-a49a64c44523] Creating disk.info with the contents: {'/var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk': 'qcow2', '/var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Dec 05 06:23:47 compute-0 nova_compute[186329]: 2025-12-05 06:23:47.393 186333 DEBUG nova.virt.libvirt.driver [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 89d73880-ffbb-49c5-9e2d-a49a64c44523] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Dec 05 06:23:47 compute-0 nova_compute[186329]: 2025-12-05 06:23:47.393 186333 DEBUG nova.objects.instance [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 89d73880-ffbb-49c5-9e2d-a49a64c44523 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:23:47 compute-0 nova_compute[186329]: 2025-12-05 06:23:47.897 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:23:47 compute-0 nova_compute[186329]: 2025-12-05 06:23:47.900 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:23:47 compute-0 nova_compute[186329]: 2025-12-05 06:23:47.901 186333 DEBUG oslo_concurrency.processutils [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:23:47 compute-0 nova_compute[186329]: 2025-12-05 06:23:47.942 186333 DEBUG oslo_concurrency.processutils [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:23:47 compute-0 nova_compute[186329]: 2025-12-05 06:23:47.943 186333 DEBUG oslo_concurrency.lockutils [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:23:47 compute-0 nova_compute[186329]: 2025-12-05 06:23:47.943 186333 DEBUG oslo_concurrency.lockutils [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:23:47 compute-0 nova_compute[186329]: 2025-12-05 06:23:47.944 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:23:47 compute-0 nova_compute[186329]: 2025-12-05 06:23:47.946 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:23:47 compute-0 nova_compute[186329]: 2025-12-05 06:23:47.946 186333 DEBUG oslo_concurrency.processutils [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:23:47 compute-0 nova_compute[186329]: 2025-12-05 06:23:47.985 186333 DEBUG oslo_concurrency.processutils [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:23:47 compute-0 nova_compute[186329]: 2025-12-05 06:23:47.986 186333 DEBUG oslo_concurrency.processutils [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b,backing_fmt=raw /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.001 186333 DEBUG oslo_concurrency.lockutils [None req-09b1ca87-d466-42d6-b987-44fa45ffd684 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Acquiring lock "8ed77332-76f2-4438-9d44-961742779ff9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.001 186333 DEBUG oslo_concurrency.lockutils [None req-09b1ca87-d466-42d6-b987-44fa45ffd684 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lock "8ed77332-76f2-4438-9d44-961742779ff9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.002 186333 DEBUG oslo_concurrency.lockutils [None req-09b1ca87-d466-42d6-b987-44fa45ffd684 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Acquiring lock "8ed77332-76f2-4438-9d44-961742779ff9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.002 186333 DEBUG oslo_concurrency.lockutils [None req-09b1ca87-d466-42d6-b987-44fa45ffd684 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lock "8ed77332-76f2-4438-9d44-961742779ff9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.002 186333 DEBUG oslo_concurrency.lockutils [None req-09b1ca87-d466-42d6-b987-44fa45ffd684 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lock "8ed77332-76f2-4438-9d44-961742779ff9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.004 186333 DEBUG oslo_concurrency.processutils [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b,backing_fmt=raw /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk 1073741824" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.005 186333 DEBUG oslo_concurrency.lockutils [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.061s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.005 186333 DEBUG oslo_concurrency.processutils [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.016 186333 INFO nova.compute.manager [None req-09b1ca87-d466-42d6-b987-44fa45ffd684 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Terminating instance
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.047 186333 DEBUG oslo_concurrency.processutils [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.047 186333 DEBUG nova.virt.disk.api [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Checking if we can resize image /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.048 186333 DEBUG oslo_concurrency.processutils [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.091 186333 DEBUG oslo_concurrency.processutils [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.091 186333 DEBUG nova.virt.disk.api [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Cannot resize image /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.092 186333 DEBUG nova.objects.instance [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lazy-loading 'migration_context' on Instance uuid 89d73880-ffbb-49c5-9e2d-a49a64c44523 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.525 186333 DEBUG nova.compute.manager [None req-09b1ca87-d466-42d6-b987-44fa45ffd684 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 05 06:23:48 compute-0 kernel: tap3f0185ae-48 (unregistering): left promiscuous mode
Dec 05 06:23:48 compute-0 NetworkManager[55434]: <info>  [1764915828.5478] device (tap3f0185ae-48): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.551 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:48 compute-0 ovn_controller[95223]: 2025-12-05T06:23:48Z|00134|binding|INFO|Releasing lport 3f0185ae-482f-4962-94bc-a3aea2d2ea77 from this chassis (sb_readonly=0)
Dec 05 06:23:48 compute-0 ovn_controller[95223]: 2025-12-05T06:23:48Z|00135|binding|INFO|Setting lport 3f0185ae-482f-4962-94bc-a3aea2d2ea77 down in Southbound
Dec 05 06:23:48 compute-0 ovn_controller[95223]: 2025-12-05T06:23:48Z|00136|binding|INFO|Removing iface tap3f0185ae-48 ovn-installed in OVS
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.554 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:48 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:48.558 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:12:a1:86 10.100.0.9'], port_security=['fa:16:3e:12:a1:86 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '8ed77332-76f2-4438-9d44-961742779ff9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8c6607ab-315b-4ce0-bb4d-e22d0d588c81', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bc5d63a38e00424aa78cb06b6b41bc09', 'neutron:revision_number': '5', 'neutron:security_group_ids': '7b69bd8c-f5f0-4d39-bcf7-4157a4a0e235', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d520596c-b299-4696-9930-0280ad4bb232, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], logical_port=3f0185ae-482f-4962-94bc-a3aea2d2ea77) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:23:48 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:48.559 104041 INFO neutron.agent.ovn.metadata.agent [-] Port 3f0185ae-482f-4962-94bc-a3aea2d2ea77 in datapath 8c6607ab-315b-4ce0-bb4d-e22d0d588c81 unbound from our chassis
Dec 05 06:23:48 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:48.560 104041 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8c6607ab-315b-4ce0-bb4d-e22d0d588c81, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 05 06:23:48 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:48.561 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[df907045-a630-4d9d-976b-7d5c9bd0aba3]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:48 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:48.561 104041 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81 namespace which is not needed anymore
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.575 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:48 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000f.scope: Deactivated successfully.
Dec 05 06:23:48 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000f.scope: Consumed 11.155s CPU time.
Dec 05 06:23:48 compute-0 systemd-machined[152967]: Machine qemu-11-instance-0000000f terminated.
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.597 186333 DEBUG nova.objects.base [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Object Instance<89d73880-ffbb-49c5-9e2d-a49a64c44523> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.597 186333 DEBUG oslo_concurrency.processutils [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.625 186333 DEBUG oslo_concurrency.processutils [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk.config 497664" returned: 0 in 0.028s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.626 186333 DEBUG nova.virt.libvirt.driver [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 89d73880-ffbb-49c5-9e2d-a49a64c44523] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.627 186333 DEBUG nova.virt.libvirt.vif [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-05T06:22:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-1008477261',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-100',id=14,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T06:23:07Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='bc5d63a38e00424aa78cb06b6b41bc09',ramdisk_id='',reservation_id='r-x2b0o64y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-409405411',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-409405411-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T06:23:07Z,user_data=None,user_id='1b1d9849dd3f4328991385825f24dc8f',uuid=89d73880-ffbb-49c5-9e2d-a49a64c44523,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "66b62d9b-f25d-401b-b95c-0c8b3982f733", "address": "fa:16:3e:f6:fd:2f", "network": {"id": "8c6607ab-315b-4ce0-bb4d-e22d0d588c81", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1945444158-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa7b2a65c9a54b598b902ce6fa21d41e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap66b62d9b-f2", "ovs_interfaceid": "66b62d9b-f25d-401b-b95c-0c8b3982f733", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.627 186333 DEBUG nova.network.os_vif_util [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Converting VIF {"id": "66b62d9b-f25d-401b-b95c-0c8b3982f733", "address": "fa:16:3e:f6:fd:2f", "network": {"id": "8c6607ab-315b-4ce0-bb4d-e22d0d588c81", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1945444158-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa7b2a65c9a54b598b902ce6fa21d41e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap66b62d9b-f2", "ovs_interfaceid": "66b62d9b-f25d-401b-b95c-0c8b3982f733", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.628 186333 DEBUG nova.network.os_vif_util [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f6:fd:2f,bridge_name='br-int',has_traffic_filtering=True,id=66b62d9b-f25d-401b-b95c-0c8b3982f733,network=Network(8c6607ab-315b-4ce0-bb4d-e22d0d588c81),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66b62d9b-f2') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.628 186333 DEBUG os_vif [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:fd:2f,bridge_name='br-int',has_traffic_filtering=True,id=66b62d9b-f25d-401b-b95c-0c8b3982f733,network=Network(8c6607ab-315b-4ce0-bb4d-e22d0d588c81),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66b62d9b-f2') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.629 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.629 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.629 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.630 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.630 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '6d89d232-585a-525d-a7c9-ada63db29a4b', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.631 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.633 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.634 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.636 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.636 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap66b62d9b-f2, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.637 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap66b62d9b-f2, col_values=(('qos', UUID('b5c8c261-77fa-4e81-bcc5-0774d45f3b56')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.637 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap66b62d9b-f2, col_values=(('external_ids', {'iface-id': '66b62d9b-f25d-401b-b95c-0c8b3982f733', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f6:fd:2f', 'vm-uuid': '89d73880-ffbb-49c5-9e2d-a49a64c44523'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.638 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:48 compute-0 NetworkManager[55434]: <info>  [1764915828.6395] manager: (tap66b62d9b-f2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/62)
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.640 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:23:48 compute-0 neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81[211539]: [NOTICE]   (211543) : haproxy version is 3.0.5-8e879a5
Dec 05 06:23:48 compute-0 neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81[211539]: [NOTICE]   (211543) : path to executable is /usr/sbin/haproxy
Dec 05 06:23:48 compute-0 neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81[211539]: [WARNING]  (211543) : Exiting Master process...
Dec 05 06:23:48 compute-0 podman[211650]: 2025-12-05 06:23:48.644567823 +0000 UTC m=+0.023848571 container kill 2c2b9ae202f28c327c77802448f86731f2ac0757b43e672c377a02d2bcac0040 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 05 06:23:48 compute-0 neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81[211539]: [ALERT]    (211543) : Current worker (211545) exited with code 143 (Terminated)
Dec 05 06:23:48 compute-0 neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81[211539]: [WARNING]  (211543) : All workers exited. Exiting... (0)
Dec 05 06:23:48 compute-0 systemd[1]: libpod-2c2b9ae202f28c327c77802448f86731f2ac0757b43e672c377a02d2bcac0040.scope: Deactivated successfully.
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.648 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.648 186333 INFO os_vif [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f6:fd:2f,bridge_name='br-int',has_traffic_filtering=True,id=66b62d9b-f25d-401b-b95c-0c8b3982f733,network=Network(8c6607ab-315b-4ce0-bb4d-e22d0d588c81),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66b62d9b-f2')
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.649 186333 DEBUG nova.virt.libvirt.driver [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.649 186333 DEBUG nova.compute.manager [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfdb2kzba',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='89d73880-ffbb-49c5-9e2d-a49a64c44523',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.650 186333 WARNING neutronclient.v2_0.client [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:23:48 compute-0 podman[211666]: 2025-12-05 06:23:48.671894333 +0000 UTC m=+0.016155493 container died 2c2b9ae202f28c327c77802448f86731f2ac0757b43e672c377a02d2bcac0040 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.build-date=20251202, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.676 186333 DEBUG nova.compute.manager [req-db73a5bc-8e2c-4a59-bbea-71aaced5b4fe req-a09af749-a961-4c31-86e5-78b1ccd47fc9 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Received event network-vif-unplugged-3f0185ae-482f-4962-94bc-a3aea2d2ea77 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.677 186333 DEBUG oslo_concurrency.lockutils [req-db73a5bc-8e2c-4a59-bbea-71aaced5b4fe req-a09af749-a961-4c31-86e5-78b1ccd47fc9 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "8ed77332-76f2-4438-9d44-961742779ff9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.677 186333 DEBUG oslo_concurrency.lockutils [req-db73a5bc-8e2c-4a59-bbea-71aaced5b4fe req-a09af749-a961-4c31-86e5-78b1ccd47fc9 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "8ed77332-76f2-4438-9d44-961742779ff9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.677 186333 DEBUG oslo_concurrency.lockutils [req-db73a5bc-8e2c-4a59-bbea-71aaced5b4fe req-a09af749-a961-4c31-86e5-78b1ccd47fc9 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "8ed77332-76f2-4438-9d44-961742779ff9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.677 186333 DEBUG nova.compute.manager [req-db73a5bc-8e2c-4a59-bbea-71aaced5b4fe req-a09af749-a961-4c31-86e5-78b1ccd47fc9 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] No waiting events found dispatching network-vif-unplugged-3f0185ae-482f-4962-94bc-a3aea2d2ea77 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.678 186333 DEBUG nova.compute.manager [req-db73a5bc-8e2c-4a59-bbea-71aaced5b4fe req-a09af749-a961-4c31-86e5-78b1ccd47fc9 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Received event network-vif-unplugged-3f0185ae-482f-4962-94bc-a3aea2d2ea77 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 05 06:23:48 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2c2b9ae202f28c327c77802448f86731f2ac0757b43e672c377a02d2bcac0040-userdata-shm.mount: Deactivated successfully.
Dec 05 06:23:48 compute-0 systemd[1]: var-lib-containers-storage-overlay-14e78fb306d8605ace27d70f906f3e1a2bbd2856c00f4d414b28da557d342bc4-merged.mount: Deactivated successfully.
Dec 05 06:23:48 compute-0 podman[211666]: 2025-12-05 06:23:48.689214516 +0000 UTC m=+0.033475676 container cleanup 2c2b9ae202f28c327c77802448f86731f2ac0757b43e672c377a02d2bcac0040 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec)
Dec 05 06:23:48 compute-0 systemd[1]: libpod-conmon-2c2b9ae202f28c327c77802448f86731f2ac0757b43e672c377a02d2bcac0040.scope: Deactivated successfully.
Dec 05 06:23:48 compute-0 podman[211674]: 2025-12-05 06:23:48.699795905 +0000 UTC m=+0.031091401 container remove 2c2b9ae202f28c327c77802448f86731f2ac0757b43e672c377a02d2bcac0040 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec)
Dec 05 06:23:48 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:48.703 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[f0a5fd54-e246-4bab-b97a-af8c7a85efc3]: (4, ("Fri Dec  5 06:23:48 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81 (2c2b9ae202f28c327c77802448f86731f2ac0757b43e672c377a02d2bcac0040)\n2c2b9ae202f28c327c77802448f86731f2ac0757b43e672c377a02d2bcac0040\nFri Dec  5 06:23:48 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81 (2c2b9ae202f28c327c77802448f86731f2ac0757b43e672c377a02d2bcac0040)\n2c2b9ae202f28c327c77802448f86731f2ac0757b43e672c377a02d2bcac0040\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:48 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:48.704 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[c9eac39b-e86c-40d6-81fb-acd6f42c4750]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:48 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:48.704 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8c6607ab-315b-4ce0-bb4d-e22d0d588c81.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8c6607ab-315b-4ce0-bb4d-e22d0d588c81.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:23:48 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:48.705 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[4fe896aa-0f77-4a04-9fd5-4348a28f2b4b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:48 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:48.705 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8c6607ab-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:23:48 compute-0 kernel: tap8c6607ab-30: left promiscuous mode
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.706 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.723 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:48 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:48.725 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[de66c977-caee-444e-8eab-a2d3382c212b]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:48 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:48.735 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[c2b314bc-55ba-44c7-8234-d15f0d7a765d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:48 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:48.736 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[feea4e19-f65c-4342-a18d-dfc0c435352a]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:48 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:48.748 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[2c247ba8-e8d5-4452-86b3-b65aaaae2b86]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 368336, 'reachable_time': 37019, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211702, 'error': None, 'target': 'ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:48 compute-0 systemd[1]: run-netns-ovnmeta\x2d8c6607ab\x2d315b\x2d4ce0\x2dbb4d\x2de22d0d588c81.mount: Deactivated successfully.
Dec 05 06:23:48 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:48.752 104153 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 05 06:23:48 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:48.752 104153 DEBUG oslo.privsep.daemon [-] privsep: reply[5f78db64-717c-4025-9595-12529ae244ce]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.772 186333 INFO nova.virt.libvirt.driver [-] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Instance destroyed successfully.
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.772 186333 DEBUG nova.objects.instance [None req-09b1ca87-d466-42d6-b987-44fa45ffd684 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lazy-loading 'resources' on Instance uuid 8ed77332-76f2-4438-9d44-961742779ff9 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:23:48 compute-0 nova_compute[186329]: 2025-12-05 06:23:48.829 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:49 compute-0 nova_compute[186329]: 2025-12-05 06:23:49.277 186333 DEBUG nova.virt.libvirt.vif [None req-09b1ca87-d466-42d6-b987-44fa45ffd684 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-12-05T06:23:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteNodeResourceConsolidationStrategy-server-496915993',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutenoderesourceconsolidationstrategy-server-496',id=15,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T06:23:26Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='bc5d63a38e00424aa78cb06b6b41bc09',ramdisk_id='',reservation_id='r-973d39j5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteNodeResourceConsolidationStrategy-409405411',owner_user_name='tempest-TestExecuteNodeResourceConsolidationStrategy-409405411-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T06:23:26Z,user_data=None,user_id='1b1d9849dd3f4328991385825f24dc8f',uuid=8ed77332-76f2-4438-9d44-961742779ff9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3f0185ae-482f-4962-94bc-a3aea2d2ea77", "address": "fa:16:3e:12:a1:86", "network": {"id": "8c6607ab-315b-4ce0-bb4d-e22d0d588c81", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1945444158-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa7b2a65c9a54b598b902ce6fa21d41e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f0185ae-48", "ovs_interfaceid": "3f0185ae-482f-4962-94bc-a3aea2d2ea77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 05 06:23:49 compute-0 nova_compute[186329]: 2025-12-05 06:23:49.277 186333 DEBUG nova.network.os_vif_util [None req-09b1ca87-d466-42d6-b987-44fa45ffd684 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Converting VIF {"id": "3f0185ae-482f-4962-94bc-a3aea2d2ea77", "address": "fa:16:3e:12:a1:86", "network": {"id": "8c6607ab-315b-4ce0-bb4d-e22d0d588c81", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1945444158-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa7b2a65c9a54b598b902ce6fa21d41e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f0185ae-48", "ovs_interfaceid": "3f0185ae-482f-4962-94bc-a3aea2d2ea77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:23:49 compute-0 nova_compute[186329]: 2025-12-05 06:23:49.278 186333 DEBUG nova.network.os_vif_util [None req-09b1ca87-d466-42d6-b987-44fa45ffd684 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:12:a1:86,bridge_name='br-int',has_traffic_filtering=True,id=3f0185ae-482f-4962-94bc-a3aea2d2ea77,network=Network(8c6607ab-315b-4ce0-bb4d-e22d0d588c81),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3f0185ae-48') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:23:49 compute-0 nova_compute[186329]: 2025-12-05 06:23:49.278 186333 DEBUG os_vif [None req-09b1ca87-d466-42d6-b987-44fa45ffd684 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:12:a1:86,bridge_name='br-int',has_traffic_filtering=True,id=3f0185ae-482f-4962-94bc-a3aea2d2ea77,network=Network(8c6607ab-315b-4ce0-bb4d-e22d0d588c81),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3f0185ae-48') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 05 06:23:49 compute-0 nova_compute[186329]: 2025-12-05 06:23:49.279 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:49 compute-0 nova_compute[186329]: 2025-12-05 06:23:49.279 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3f0185ae-48, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:23:49 compute-0 nova_compute[186329]: 2025-12-05 06:23:49.280 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:49 compute-0 nova_compute[186329]: 2025-12-05 06:23:49.282 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:23:49 compute-0 nova_compute[186329]: 2025-12-05 06:23:49.284 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:49 compute-0 nova_compute[186329]: 2025-12-05 06:23:49.285 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:49 compute-0 nova_compute[186329]: 2025-12-05 06:23:49.285 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=6685d8f5-fe2d-46e0-9dc8-71a6c0669a36) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:23:49 compute-0 nova_compute[186329]: 2025-12-05 06:23:49.285 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:49 compute-0 nova_compute[186329]: 2025-12-05 06:23:49.287 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:23:49 compute-0 nova_compute[186329]: 2025-12-05 06:23:49.288 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:49 compute-0 nova_compute[186329]: 2025-12-05 06:23:49.289 186333 INFO os_vif [None req-09b1ca87-d466-42d6-b987-44fa45ffd684 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:12:a1:86,bridge_name='br-int',has_traffic_filtering=True,id=3f0185ae-482f-4962-94bc-a3aea2d2ea77,network=Network(8c6607ab-315b-4ce0-bb4d-e22d0d588c81),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3f0185ae-48')
Dec 05 06:23:49 compute-0 nova_compute[186329]: 2025-12-05 06:23:49.289 186333 INFO nova.virt.libvirt.driver [None req-09b1ca87-d466-42d6-b987-44fa45ffd684 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Deleting instance files /var/lib/nova/instances/8ed77332-76f2-4438-9d44-961742779ff9_del
Dec 05 06:23:49 compute-0 nova_compute[186329]: 2025-12-05 06:23:49.290 186333 INFO nova.virt.libvirt.driver [None req-09b1ca87-d466-42d6-b987-44fa45ffd684 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Deletion of /var/lib/nova/instances/8ed77332-76f2-4438-9d44-961742779ff9_del complete
Dec 05 06:23:49 compute-0 nova_compute[186329]: 2025-12-05 06:23:49.547 186333 WARNING neutronclient.v2_0.client [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:23:49 compute-0 nova_compute[186329]: 2025-12-05 06:23:49.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:23:49 compute-0 nova_compute[186329]: 2025-12-05 06:23:49.800 186333 INFO nova.compute.manager [None req-09b1ca87-d466-42d6-b987-44fa45ffd684 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Took 1.27 seconds to destroy the instance on the hypervisor.
Dec 05 06:23:49 compute-0 nova_compute[186329]: 2025-12-05 06:23:49.800 186333 DEBUG oslo.service.backend._eventlet.loopingcall [None req-09b1ca87-d466-42d6-b987-44fa45ffd684 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 05 06:23:49 compute-0 nova_compute[186329]: 2025-12-05 06:23:49.801 186333 DEBUG nova.compute.manager [-] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 05 06:23:49 compute-0 nova_compute[186329]: 2025-12-05 06:23:49.801 186333 DEBUG nova.network.neutron [-] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 05 06:23:49 compute-0 nova_compute[186329]: 2025-12-05 06:23:49.801 186333 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:23:50 compute-0 nova_compute[186329]: 2025-12-05 06:23:50.552 186333 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:23:50 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:50.571 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '46:84:d1', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'a6:99:88:db:d9:a2'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:23:50 compute-0 nova_compute[186329]: 2025-12-05 06:23:50.571 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:50 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:50.573 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 05 06:23:50 compute-0 nova_compute[186329]: 2025-12-05 06:23:50.728 186333 DEBUG nova.compute.manager [req-50d3ec93-8e65-4946-8068-c74f3ff8925a req-58b387f8-a1f0-4bb6-a766-08ff3fd5aa84 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Received event network-vif-unplugged-3f0185ae-482f-4962-94bc-a3aea2d2ea77 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:23:50 compute-0 nova_compute[186329]: 2025-12-05 06:23:50.729 186333 DEBUG oslo_concurrency.lockutils [req-50d3ec93-8e65-4946-8068-c74f3ff8925a req-58b387f8-a1f0-4bb6-a766-08ff3fd5aa84 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "8ed77332-76f2-4438-9d44-961742779ff9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:23:50 compute-0 nova_compute[186329]: 2025-12-05 06:23:50.729 186333 DEBUG oslo_concurrency.lockutils [req-50d3ec93-8e65-4946-8068-c74f3ff8925a req-58b387f8-a1f0-4bb6-a766-08ff3fd5aa84 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "8ed77332-76f2-4438-9d44-961742779ff9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:23:50 compute-0 nova_compute[186329]: 2025-12-05 06:23:50.729 186333 DEBUG oslo_concurrency.lockutils [req-50d3ec93-8e65-4946-8068-c74f3ff8925a req-58b387f8-a1f0-4bb6-a766-08ff3fd5aa84 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "8ed77332-76f2-4438-9d44-961742779ff9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:23:50 compute-0 nova_compute[186329]: 2025-12-05 06:23:50.729 186333 DEBUG nova.compute.manager [req-50d3ec93-8e65-4946-8068-c74f3ff8925a req-58b387f8-a1f0-4bb6-a766-08ff3fd5aa84 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] No waiting events found dispatching network-vif-unplugged-3f0185ae-482f-4962-94bc-a3aea2d2ea77 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:23:50 compute-0 nova_compute[186329]: 2025-12-05 06:23:50.729 186333 DEBUG nova.compute.manager [req-50d3ec93-8e65-4946-8068-c74f3ff8925a req-58b387f8-a1f0-4bb6-a766-08ff3fd5aa84 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Received event network-vif-unplugged-3f0185ae-482f-4962-94bc-a3aea2d2ea77 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 05 06:23:52 compute-0 nova_compute[186329]: 2025-12-05 06:23:52.598 186333 DEBUG nova.compute.manager [req-8264b5df-f74b-420f-b71a-e8f1dbe0b2c0 req-827a2f15-5704-4a7d-9963-518142fb9ada fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Received event network-vif-deleted-3f0185ae-482f-4962-94bc-a3aea2d2ea77 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:23:52 compute-0 nova_compute[186329]: 2025-12-05 06:23:52.598 186333 INFO nova.compute.manager [req-8264b5df-f74b-420f-b71a-e8f1dbe0b2c0 req-827a2f15-5704-4a7d-9963-518142fb9ada fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Neutron deleted interface 3f0185ae-482f-4962-94bc-a3aea2d2ea77; detaching it from the instance and deleting it from the info cache
Dec 05 06:23:52 compute-0 nova_compute[186329]: 2025-12-05 06:23:52.598 186333 DEBUG nova.network.neutron [req-8264b5df-f74b-420f-b71a-e8f1dbe0b2c0 req-827a2f15-5704-4a7d-9963-518142fb9ada fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:23:52 compute-0 nova_compute[186329]: 2025-12-05 06:23:52.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:23:52 compute-0 nova_compute[186329]: 2025-12-05 06:23:52.710 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 05 06:23:52 compute-0 nova_compute[186329]: 2025-12-05 06:23:52.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:23:53 compute-0 nova_compute[186329]: 2025-12-05 06:23:53.057 186333 DEBUG nova.network.neutron [-] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:23:53 compute-0 nova_compute[186329]: 2025-12-05 06:23:53.102 186333 DEBUG nova.compute.manager [req-8264b5df-f74b-420f-b71a-e8f1dbe0b2c0 req-827a2f15-5704-4a7d-9963-518142fb9ada fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Detach interface failed, port_id=3f0185ae-482f-4962-94bc-a3aea2d2ea77, reason: Instance 8ed77332-76f2-4438-9d44-961742779ff9 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Dec 05 06:23:53 compute-0 nova_compute[186329]: 2025-12-05 06:23:53.220 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:23:53 compute-0 nova_compute[186329]: 2025-12-05 06:23:53.220 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:23:53 compute-0 nova_compute[186329]: 2025-12-05 06:23:53.220 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:23:53 compute-0 nova_compute[186329]: 2025-12-05 06:23:53.221 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 05 06:23:53 compute-0 nova_compute[186329]: 2025-12-05 06:23:53.398 186333 WARNING nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:23:53 compute-0 nova_compute[186329]: 2025-12-05 06:23:53.399 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:23:53 compute-0 nova_compute[186329]: 2025-12-05 06:23:53.415 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:23:53 compute-0 nova_compute[186329]: 2025-12-05 06:23:53.416 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5864MB free_disk=73.16653442382812GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 05 06:23:53 compute-0 nova_compute[186329]: 2025-12-05 06:23:53.416 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:23:53 compute-0 nova_compute[186329]: 2025-12-05 06:23:53.416 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:23:53 compute-0 nova_compute[186329]: 2025-12-05 06:23:53.532 186333 DEBUG nova.network.neutron [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 89d73880-ffbb-49c5-9e2d-a49a64c44523] Port 66b62d9b-f25d-401b-b95c-0c8b3982f733 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Dec 05 06:23:53 compute-0 nova_compute[186329]: 2025-12-05 06:23:53.538 186333 DEBUG nova.compute.manager [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfdb2kzba',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='89d73880-ffbb-49c5-9e2d-a49a64c44523',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Dec 05 06:23:53 compute-0 nova_compute[186329]: 2025-12-05 06:23:53.562 186333 INFO nova.compute.manager [-] [instance: 8ed77332-76f2-4438-9d44-961742779ff9] Took 3.76 seconds to deallocate network for instance.
Dec 05 06:23:53 compute-0 nova_compute[186329]: 2025-12-05 06:23:53.832 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:54 compute-0 nova_compute[186329]: 2025-12-05 06:23:54.072 186333 DEBUG oslo_concurrency.lockutils [None req-09b1ca87-d466-42d6-b987-44fa45ffd684 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:23:54 compute-0 nova_compute[186329]: 2025-12-05 06:23:54.285 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:54 compute-0 nova_compute[186329]: 2025-12-05 06:23:54.430 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Migration for instance 89d73880-ffbb-49c5-9e2d-a49a64c44523 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Dec 05 06:23:54 compute-0 nova_compute[186329]: 2025-12-05 06:23:54.934 186333 INFO nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] [instance: 89d73880-ffbb-49c5-9e2d-a49a64c44523] Updating resource usage from migration 199cbc91-6cce-41f3-8181-2ad3666fc5f5
Dec 05 06:23:54 compute-0 nova_compute[186329]: 2025-12-05 06:23:54.935 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] [instance: 89d73880-ffbb-49c5-9e2d-a49a64c44523] Starting to track incoming migration 199cbc91-6cce-41f3-8181-2ad3666fc5f5 with flavor cb13e320-971c-46c2-a935-d695f3631bf8 _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Dec 05 06:23:55 compute-0 podman[211724]: 2025-12-05 06:23:55.460780239 +0000 UTC m=+0.045386936 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 06:23:55 compute-0 podman[211723]: 2025-12-05 06:23:55.486874312 +0000 UTC m=+0.072950662 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2)
Dec 05 06:23:55 compute-0 kernel: tap66b62d9b-f2: entered promiscuous mode
Dec 05 06:23:55 compute-0 nova_compute[186329]: 2025-12-05 06:23:55.880 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:55 compute-0 ovn_controller[95223]: 2025-12-05T06:23:55Z|00137|binding|INFO|Claiming lport 66b62d9b-f25d-401b-b95c-0c8b3982f733 for this additional chassis.
Dec 05 06:23:55 compute-0 NetworkManager[55434]: <info>  [1764915835.8813] manager: (tap66b62d9b-f2): new Tun device (/org/freedesktop/NetworkManager/Devices/63)
Dec 05 06:23:55 compute-0 ovn_controller[95223]: 2025-12-05T06:23:55Z|00138|binding|INFO|66b62d9b-f25d-401b-b95c-0c8b3982f733: Claiming fa:16:3e:f6:fd:2f 10.100.0.13
Dec 05 06:23:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:55.887 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f6:fd:2f 10.100.0.13'], port_security=['fa:16:3e:f6:fd:2f 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '89d73880-ffbb-49c5-9e2d-a49a64c44523', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8c6607ab-315b-4ce0-bb4d-e22d0d588c81', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bc5d63a38e00424aa78cb06b6b41bc09', 'neutron:revision_number': '10', 'neutron:security_group_ids': '7b69bd8c-f5f0-4d39-bcf7-4157a4a0e235', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d520596c-b299-4696-9930-0280ad4bb232, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=66b62d9b-f25d-401b-b95c-0c8b3982f733) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:23:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:55.888 104041 INFO neutron.agent.ovn.metadata.agent [-] Port 66b62d9b-f25d-401b-b95c-0c8b3982f733 in datapath 8c6607ab-315b-4ce0-bb4d-e22d0d588c81 unbound from our chassis
Dec 05 06:23:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:55.889 104041 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8c6607ab-315b-4ce0-bb4d-e22d0d588c81
Dec 05 06:23:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:55.897 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[dcb7addd-8d01-44c3-bd3f-297980bc1c8f]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:55.898 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8c6607ab-31 in ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 05 06:23:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:55.900 206383 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8c6607ab-30 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 05 06:23:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:55.900 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[25cd9186-4714-486b-bf22-6b815e21a81f]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:55.900 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[7b29fc43-56e3-4833-aed6-91afecc5d044]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:55 compute-0 nova_compute[186329]: 2025-12-05 06:23:55.904 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:55 compute-0 nova_compute[186329]: 2025-12-05 06:23:55.908 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:55 compute-0 ovn_controller[95223]: 2025-12-05T06:23:55Z|00139|binding|INFO|Setting lport 66b62d9b-f25d-401b-b95c-0c8b3982f733 ovn-installed in OVS
Dec 05 06:23:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:55.908 104153 DEBUG oslo.privsep.daemon [-] privsep: reply[e3f09402-bbaf-4018-831a-f5a716613492]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:55 compute-0 systemd-udevd[211785]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 06:23:55 compute-0 systemd-machined[152967]: New machine qemu-12-instance-0000000e.
Dec 05 06:23:55 compute-0 NetworkManager[55434]: <info>  [1764915835.9244] device (tap66b62d9b-f2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 06:23:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:55.924 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[0e9a1357-2e12-4be2-a2a6-3ff846cca438]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:55 compute-0 NetworkManager[55434]: <info>  [1764915835.9254] device (tap66b62d9b-f2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 06:23:55 compute-0 systemd[1]: Started Virtual Machine qemu-12-instance-0000000e.
Dec 05 06:23:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:55.942 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[ce924834-09b5-4b1d-b6e1-b6b3a65555dd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:55.945 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[6508b82c-0202-4229-a3b7-497f8ef20257]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:55 compute-0 NetworkManager[55434]: <info>  [1764915835.9457] manager: (tap8c6607ab-30): new Veth device (/org/freedesktop/NetworkManager/Devices/64)
Dec 05 06:23:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:55.966 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[705ed58c-bc13-4e90-9dec-b0b6f032f270]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:55 compute-0 nova_compute[186329]: 2025-12-05 06:23:55.968 186333 INFO nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance 83dcc9e9-9fcf-4e34-8b76-598c38ed9bc0 has allocations against this compute host but is not found in the database.
Dec 05 06:23:55 compute-0 nova_compute[186329]: 2025-12-05 06:23:55.968 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance 8ed77332-76f2-4438-9d44-961742779ff9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 05 06:23:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:55.969 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[d2eecfd5-59cb-440a-a3f6-932819fb1dd8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:55 compute-0 NetworkManager[55434]: <info>  [1764915835.9853] device (tap8c6607ab-30): carrier: link connected
Dec 05 06:23:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:55.988 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[c6456324-973b-42d8-9693-a6e208723ef3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:56.000 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[b55b7d4e-2e46-478f-a38d-1070cde240ec]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8c6607ab-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1d:77:7c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371452, 'reachable_time': 33769, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211808, 'error': None, 'target': 'ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:56.011 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[09ae5fe7-570f-494e-9543-408e7c3c1efb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1d:777c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371452, 'tstamp': 371452}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211809, 'error': None, 'target': 'ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:56.023 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[c441459b-93ae-40d3-814e-824fff34ed29]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8c6607ab-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1d:77:7c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371452, 'reachable_time': 33769, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 211810, 'error': None, 'target': 'ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:56.042 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[c7a122e2-d90f-47e5-a125-d897a766c6d0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:56.078 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[074fb05e-b221-4149-9dfd-162f9b09458e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:56.079 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8c6607ab-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:56.079 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:56.081 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8c6607ab-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:23:56 compute-0 nova_compute[186329]: 2025-12-05 06:23:56.083 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:56 compute-0 kernel: tap8c6607ab-30: entered promiscuous mode
Dec 05 06:23:56 compute-0 NetworkManager[55434]: <info>  [1764915836.0838] manager: (tap8c6607ab-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/65)
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:56.086 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8c6607ab-30, col_values=(('external_ids', {'iface-id': '23184677-e308-4f95-b4f6-6e02e8b7fc45'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:23:56 compute-0 nova_compute[186329]: 2025-12-05 06:23:56.087 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:56 compute-0 ovn_controller[95223]: 2025-12-05T06:23:56Z|00140|binding|INFO|Releasing lport 23184677-e308-4f95-b4f6-6e02e8b7fc45 from this chassis (sb_readonly=0)
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:56.088 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[6a9ad4a5-f6ba-4510-bf49-3de581b890f0]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:56.103 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8c6607ab-315b-4ce0-bb4d-e22d0d588c81.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8c6607ab-315b-4ce0-bb4d-e22d0d588c81.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:56.103 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8c6607ab-315b-4ce0-bb4d-e22d0d588c81.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8c6607ab-315b-4ce0-bb4d-e22d0d588c81.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:56.103 104041 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 8c6607ab-315b-4ce0-bb4d-e22d0d588c81 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:56.103 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8c6607ab-315b-4ce0-bb4d-e22d0d588c81.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8c6607ab-315b-4ce0-bb4d-e22d0d588c81.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:23:56 compute-0 nova_compute[186329]: 2025-12-05 06:23:56.103 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:56.104 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[b41bbc77-420f-4814-a857-73e923bd9603]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:56.104 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8c6607ab-315b-4ce0-bb4d-e22d0d588c81.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8c6607ab-315b-4ce0-bb4d-e22d0d588c81.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:56.105 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[9e3387de-a305-4154-98d5-c33133a56d74]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:56.105 104041 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]: global
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]:     log         /dev/log local0 debug
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]:     log-tag     haproxy-metadata-proxy-8c6607ab-315b-4ce0-bb4d-e22d0d588c81
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]:     user        root
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]:     group       root
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]:     maxconn     1024
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]:     pidfile     /var/lib/neutron/external/pids/8c6607ab-315b-4ce0-bb4d-e22d0d588c81.pid.haproxy
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]:     daemon
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]: defaults
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]:     log global
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]:     mode http
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]:     option httplog
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]:     option dontlognull
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]:     option http-server-close
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]:     option forwardfor
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]:     retries                 3
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]:     timeout http-request    30s
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]:     timeout connect         30s
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]:     timeout client          32s
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]:     timeout server          32s
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]:     timeout http-keep-alive 30s
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]: listen listener
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]:     bind 169.254.169.254:80
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]:     
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]:     http-request add-header X-OVN-Network-ID 8c6607ab-315b-4ce0-bb4d-e22d0d588c81
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 05 06:23:56 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:56.105 104041 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81', 'env', 'PROCESS_TAG=haproxy-8c6607ab-315b-4ce0-bb4d-e22d0d588c81', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8c6607ab-315b-4ce0-bb4d-e22d0d588c81.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 05 06:23:56 compute-0 podman[211845]: 2025-12-05 06:23:56.407128497 +0000 UTC m=+0.030979100 container create 9c87e723715f5389f025c6d78752caec119c2e7bd16d699f2e45247ab18c6b57 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS)
Dec 05 06:23:56 compute-0 systemd[1]: Started libpod-conmon-9c87e723715f5389f025c6d78752caec119c2e7bd16d699f2e45247ab18c6b57.scope.
Dec 05 06:23:56 compute-0 systemd[1]: Started libcrun container.
Dec 05 06:23:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f98aa816846180de79609db47eb76032efdd57f08263cea117a430265d623a25/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 06:23:56 compute-0 podman[211845]: 2025-12-05 06:23:56.471472332 +0000 UTC m=+0.095322935 container init 9c87e723715f5389f025c6d78752caec119c2e7bd16d699f2e45247ab18c6b57 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 05 06:23:56 compute-0 nova_compute[186329]: 2025-12-05 06:23:56.472 186333 WARNING nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance 89d73880-ffbb-49c5-9e2d-a49a64c44523 has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Dec 05 06:23:56 compute-0 nova_compute[186329]: 2025-12-05 06:23:56.473 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 05 06:23:56 compute-0 nova_compute[186329]: 2025-12-05 06:23:56.473 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=4 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 06:23:53 up  1:01,  0 user,  load average: 0.18, 0.17, 0.26\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_deleting': '1', 'num_os_type_None': '1', 'num_proj_bc5d63a38e00424aa78cb06b6b41bc09': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 05 06:23:56 compute-0 podman[211845]: 2025-12-05 06:23:56.475683783 +0000 UTC m=+0.099534386 container start 9c87e723715f5389f025c6d78752caec119c2e7bd16d699f2e45247ab18c6b57 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 05 06:23:56 compute-0 podman[211845]: 2025-12-05 06:23:56.394042277 +0000 UTC m=+0.017892890 image pull 773fbabd60bc1c8e2ad203301a187a593937fa42bcc751aba0971bd96baac0cb quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current
Dec 05 06:23:56 compute-0 neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81[211858]: [NOTICE]   (211862) : New worker (211864) forked
Dec 05 06:23:56 compute-0 neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81[211858]: [NOTICE]   (211862) : Loading success.
Dec 05 06:23:56 compute-0 nova_compute[186329]: 2025-12-05 06:23:56.523 186333 DEBUG nova.compute.provider_tree [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:23:57 compute-0 nova_compute[186329]: 2025-12-05 06:23:57.028 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:23:57 compute-0 nova_compute[186329]: 2025-12-05 06:23:57.541 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 05 06:23:57 compute-0 nova_compute[186329]: 2025-12-05 06:23:57.542 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.125s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:23:57 compute-0 nova_compute[186329]: 2025-12-05 06:23:57.542 186333 DEBUG oslo_concurrency.lockutils [None req-09b1ca87-d466-42d6-b987-44fa45ffd684 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 3.470s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:23:57 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:23:57.575 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=89d40815-76f5-4f1d-9077-84d831b7d6c4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:23:57 compute-0 nova_compute[186329]: 2025-12-05 06:23:57.595 186333 DEBUG nova.compute.provider_tree [None req-09b1ca87-d466-42d6-b987-44fa45ffd684 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:23:58 compute-0 nova_compute[186329]: 2025-12-05 06:23:58.100 186333 DEBUG nova.scheduler.client.report [None req-09b1ca87-d466-42d6-b987-44fa45ffd684 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:23:58 compute-0 nova_compute[186329]: 2025-12-05 06:23:58.538 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:23:58 compute-0 nova_compute[186329]: 2025-12-05 06:23:58.538 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:23:58 compute-0 nova_compute[186329]: 2025-12-05 06:23:58.539 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:23:58 compute-0 nova_compute[186329]: 2025-12-05 06:23:58.539 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:23:58 compute-0 nova_compute[186329]: 2025-12-05 06:23:58.607 186333 DEBUG oslo_concurrency.lockutils [None req-09b1ca87-d466-42d6-b987-44fa45ffd684 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.065s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:23:58 compute-0 nova_compute[186329]: 2025-12-05 06:23:58.622 186333 INFO nova.scheduler.client.report [None req-09b1ca87-d466-42d6-b987-44fa45ffd684 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Deleted allocations for instance 8ed77332-76f2-4438-9d44-961742779ff9
Dec 05 06:23:58 compute-0 ovn_controller[95223]: 2025-12-05T06:23:58Z|00141|binding|INFO|Claiming lport 66b62d9b-f25d-401b-b95c-0c8b3982f733 for this chassis.
Dec 05 06:23:58 compute-0 ovn_controller[95223]: 2025-12-05T06:23:58Z|00142|binding|INFO|66b62d9b-f25d-401b-b95c-0c8b3982f733: Claiming fa:16:3e:f6:fd:2f 10.100.0.13
Dec 05 06:23:58 compute-0 ovn_controller[95223]: 2025-12-05T06:23:58Z|00143|binding|INFO|Setting lport 66b62d9b-f25d-401b-b95c-0c8b3982f733 up in Southbound
Dec 05 06:23:58 compute-0 nova_compute[186329]: 2025-12-05 06:23:58.834 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:59 compute-0 nova_compute[186329]: 2025-12-05 06:23:59.286 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:23:59 compute-0 nova_compute[186329]: 2025-12-05 06:23:59.640 186333 DEBUG oslo_concurrency.lockutils [None req-09b1ca87-d466-42d6-b987-44fa45ffd684 1b1d9849dd3f4328991385825f24dc8f bc5d63a38e00424aa78cb06b6b41bc09 - - default default] Lock "8ed77332-76f2-4438-9d44-961742779ff9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 11.638s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:23:59 compute-0 nova_compute[186329]: 2025-12-05 06:23:59.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:23:59 compute-0 podman[196599]: time="2025-12-05T06:23:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:23:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:23:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18590 "" "Go-http-client/1.1"
Dec 05 06:23:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:23:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3044 "" "Go-http-client/1.1"
Dec 05 06:24:00 compute-0 nova_compute[186329]: 2025-12-05 06:24:00.819 186333 INFO nova.compute.manager [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 89d73880-ffbb-49c5-9e2d-a49a64c44523] Post operation of migration started
Dec 05 06:24:00 compute-0 nova_compute[186329]: 2025-12-05 06:24:00.819 186333 WARNING neutronclient.v2_0.client [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:24:00 compute-0 nova_compute[186329]: 2025-12-05 06:24:00.911 186333 WARNING neutronclient.v2_0.client [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:24:00 compute-0 nova_compute[186329]: 2025-12-05 06:24:00.911 186333 WARNING neutronclient.v2_0.client [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:24:01 compute-0 openstack_network_exporter[198686]: ERROR   06:24:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:24:01 compute-0 openstack_network_exporter[198686]: ERROR   06:24:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:24:01 compute-0 openstack_network_exporter[198686]: ERROR   06:24:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:24:01 compute-0 openstack_network_exporter[198686]: ERROR   06:24:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:24:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:24:01 compute-0 openstack_network_exporter[198686]: ERROR   06:24:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:24:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:24:01 compute-0 nova_compute[186329]: 2025-12-05 06:24:01.539 186333 DEBUG oslo_concurrency.lockutils [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "refresh_cache-89d73880-ffbb-49c5-9e2d-a49a64c44523" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:24:01 compute-0 nova_compute[186329]: 2025-12-05 06:24:01.539 186333 DEBUG oslo_concurrency.lockutils [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquired lock "refresh_cache-89d73880-ffbb-49c5-9e2d-a49a64c44523" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:24:01 compute-0 nova_compute[186329]: 2025-12-05 06:24:01.539 186333 DEBUG nova.network.neutron [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 89d73880-ffbb-49c5-9e2d-a49a64c44523] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 05 06:24:02 compute-0 nova_compute[186329]: 2025-12-05 06:24:02.043 186333 WARNING neutronclient.v2_0.client [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:24:02 compute-0 ovn_controller[95223]: 2025-12-05T06:24:02Z|00144|binding|INFO|Removing iface tap66b62d9b-f2 ovn-installed in OVS
Dec 05 06:24:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:24:02.727 104041 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port fcf95640-9e1f-4288-a6d2-6f9cfce3bbd5 with type ""
Dec 05 06:24:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:24:02.728 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f6:fd:2f 10.100.0.13'], port_security=['fa:16:3e:f6:fd:2f 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '89d73880-ffbb-49c5-9e2d-a49a64c44523', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8c6607ab-315b-4ce0-bb4d-e22d0d588c81', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bc5d63a38e00424aa78cb06b6b41bc09', 'neutron:revision_number': '15', 'neutron:security_group_ids': '7b69bd8c-f5f0-4d39-bcf7-4157a4a0e235', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d520596c-b299-4696-9930-0280ad4bb232, chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], logical_port=66b62d9b-f25d-401b-b95c-0c8b3982f733) old= matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:24:02 compute-0 ovn_controller[95223]: 2025-12-05T06:24:02Z|00145|binding|INFO|Removing lport 66b62d9b-f25d-401b-b95c-0c8b3982f733 ovn-installed in OVS
Dec 05 06:24:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:24:02.730 104041 INFO neutron.agent.ovn.metadata.agent [-] Port 66b62d9b-f25d-401b-b95c-0c8b3982f733 in datapath 8c6607ab-315b-4ce0-bb4d-e22d0d588c81 unbound from our chassis
Dec 05 06:24:02 compute-0 nova_compute[186329]: 2025-12-05 06:24:02.730 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:24:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:24:02.731 104041 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8c6607ab-315b-4ce0-bb4d-e22d0d588c81, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 05 06:24:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:24:02.732 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[8793f0c6-ced1-424b-ac8d-f3ab6701a8ae]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:24:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:24:02.732 104041 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81 namespace which is not needed anymore
Dec 05 06:24:02 compute-0 nova_compute[186329]: 2025-12-05 06:24:02.740 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:24:02 compute-0 nova_compute[186329]: 2025-12-05 06:24:02.805 186333 WARNING neutronclient.v2_0.client [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:24:02 compute-0 neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81[211858]: [NOTICE]   (211862) : haproxy version is 3.0.5-8e879a5
Dec 05 06:24:02 compute-0 neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81[211858]: [NOTICE]   (211862) : path to executable is /usr/sbin/haproxy
Dec 05 06:24:02 compute-0 neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81[211858]: [WARNING]  (211862) : Exiting Master process...
Dec 05 06:24:02 compute-0 podman[211889]: 2025-12-05 06:24:02.815201436 +0000 UTC m=+0.021548294 container kill 9c87e723715f5389f025c6d78752caec119c2e7bd16d699f2e45247ab18c6b57 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, org.label-schema.build-date=20251202)
Dec 05 06:24:02 compute-0 neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81[211858]: [ALERT]    (211862) : Current worker (211864) exited with code 143 (Terminated)
Dec 05 06:24:02 compute-0 neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81[211858]: [WARNING]  (211862) : All workers exited. Exiting... (0)
Dec 05 06:24:02 compute-0 systemd[1]: libpod-9c87e723715f5389f025c6d78752caec119c2e7bd16d699f2e45247ab18c6b57.scope: Deactivated successfully.
Dec 05 06:24:02 compute-0 podman[211902]: 2025-12-05 06:24:02.847402713 +0000 UTC m=+0.016822979 container died 9c87e723715f5389f025c6d78752caec119c2e7bd16d699f2e45247ab18c6b57 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 05 06:24:02 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9c87e723715f5389f025c6d78752caec119c2e7bd16d699f2e45247ab18c6b57-userdata-shm.mount: Deactivated successfully.
Dec 05 06:24:02 compute-0 systemd[1]: var-lib-containers-storage-overlay-f98aa816846180de79609db47eb76032efdd57f08263cea117a430265d623a25-merged.mount: Deactivated successfully.
Dec 05 06:24:02 compute-0 podman[211902]: 2025-12-05 06:24:02.870729892 +0000 UTC m=+0.040150148 container cleanup 9c87e723715f5389f025c6d78752caec119c2e7bd16d699f2e45247ab18c6b57 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec)
Dec 05 06:24:02 compute-0 systemd[1]: libpod-conmon-9c87e723715f5389f025c6d78752caec119c2e7bd16d699f2e45247ab18c6b57.scope: Deactivated successfully.
Dec 05 06:24:02 compute-0 podman[211903]: 2025-12-05 06:24:02.878811962 +0000 UTC m=+0.045533102 container remove 9c87e723715f5389f025c6d78752caec119c2e7bd16d699f2e45247ab18c6b57 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81, io.buildah.version=1.41.4, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 05 06:24:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:24:02.882 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[8332974b-6537-4cc4-8b08-79ba0e153e1c]: (4, ("Fri Dec  5 06:24:02 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81 (9c87e723715f5389f025c6d78752caec119c2e7bd16d699f2e45247ab18c6b57)\n9c87e723715f5389f025c6d78752caec119c2e7bd16d699f2e45247ab18c6b57\nFri Dec  5 06:24:02 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81 (9c87e723715f5389f025c6d78752caec119c2e7bd16d699f2e45247ab18c6b57)\n9c87e723715f5389f025c6d78752caec119c2e7bd16d699f2e45247ab18c6b57\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:24:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:24:02.883 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[6dad4e76-7c0a-4ea2-83f0-acfbc79530c4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:24:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:24:02.883 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8c6607ab-315b-4ce0-bb4d-e22d0d588c81.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8c6607ab-315b-4ce0-bb4d-e22d0d588c81.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:24:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:24:02.884 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[1f7b29ee-02d8-499d-991c-f7ab325d5c91]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:24:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:24:02.884 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8c6607ab-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:24:02 compute-0 kernel: tap8c6607ab-30: left promiscuous mode
Dec 05 06:24:02 compute-0 nova_compute[186329]: 2025-12-05 06:24:02.886 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:24:02 compute-0 nova_compute[186329]: 2025-12-05 06:24:02.897 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:24:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:24:02.899 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[7a60740c-ddb4-4e3a-a1e2-618cccbe7296]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:24:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:24:02.911 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[dea613d5-399b-406b-843f-1fc284e3955b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:24:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:24:02.911 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[7fde8c1c-d994-4197-9aa5-415aafdbdc04]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:24:02 compute-0 nova_compute[186329]: 2025-12-05 06:24:02.923 186333 DEBUG nova.network.neutron [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 89d73880-ffbb-49c5-9e2d-a49a64c44523] Updating instance_info_cache with network_info: [{"id": "66b62d9b-f25d-401b-b95c-0c8b3982f733", "address": "fa:16:3e:f6:fd:2f", "network": {"id": "8c6607ab-315b-4ce0-bb4d-e22d0d588c81", "bridge": "br-int", "label": "tempest-TestExecuteNodeResourceConsolidationStrategy-1945444158-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fa7b2a65c9a54b598b902ce6fa21d41e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66b62d9b-f2", "ovs_interfaceid": "66b62d9b-f25d-401b-b95c-0c8b3982f733", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:24:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:24:02.923 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[12284ff5-7bbe-4920-bd77-ab0f55bd2f1e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371447, 'reachable_time': 30918, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211926, 'error': None, 'target': 'ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:24:02 compute-0 systemd[1]: run-netns-ovnmeta\x2d8c6607ab\x2d315b\x2d4ce0\x2dbb4d\x2de22d0d588c81.mount: Deactivated successfully.
Dec 05 06:24:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:24:02.927 104153 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8c6607ab-315b-4ce0-bb4d-e22d0d588c81 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 05 06:24:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:24:02.927 104153 DEBUG oslo.privsep.daemon [-] privsep: reply[94fb6ebe-b42b-4e23-a928-ac12cb451c58]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:24:03 compute-0 nova_compute[186329]: 2025-12-05 06:24:03.427 186333 DEBUG oslo_concurrency.lockutils [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Releasing lock "refresh_cache-89d73880-ffbb-49c5-9e2d-a49a64c44523" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:24:03 compute-0 nova_compute[186329]: 2025-12-05 06:24:03.836 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:24:03 compute-0 nova_compute[186329]: 2025-12-05 06:24:03.940 186333 DEBUG oslo_concurrency.lockutils [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:24:03 compute-0 nova_compute[186329]: 2025-12-05 06:24:03.940 186333 DEBUG oslo_concurrency.lockutils [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:24:03 compute-0 nova_compute[186329]: 2025-12-05 06:24:03.940 186333 DEBUG oslo_concurrency.lockutils [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:24:03 compute-0 nova_compute[186329]: 2025-12-05 06:24:03.943 186333 INFO nova.virt.libvirt.driver [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 89d73880-ffbb-49c5-9e2d-a49a64c44523] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Dec 05 06:24:03 compute-0 virtqemud[186605]: Domain id=12 name='instance-0000000e' uuid=89d73880-ffbb-49c5-9e2d-a49a64c44523 is tainted: custom-monitor
Dec 05 06:24:04 compute-0 nova_compute[186329]: 2025-12-05 06:24:04.287 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:24:04 compute-0 nova_compute[186329]: 2025-12-05 06:24:04.948 186333 INFO nova.virt.libvirt.driver [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 89d73880-ffbb-49c5-9e2d-a49a64c44523] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Dec 05 06:24:05 compute-0 nova_compute[186329]: 2025-12-05 06:24:05.953 186333 INFO nova.virt.libvirt.driver [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 89d73880-ffbb-49c5-9e2d-a49a64c44523] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Dec 05 06:24:05 compute-0 nova_compute[186329]: 2025-12-05 06:24:05.956 186333 DEBUG nova.compute.manager [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 89d73880-ffbb-49c5-9e2d-a49a64c44523] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 05 06:24:06 compute-0 nova_compute[186329]: 2025-12-05 06:24:06.464 186333 DEBUG nova.objects.instance [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 89d73880-ffbb-49c5-9e2d-a49a64c44523] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Dec 05 06:24:06 compute-0 podman[211939]: 2025-12-05 06:24:06.484261771 +0000 UTC m=+0.069312211 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_managed=true, config_id=ovn_metadata_agent)
Dec 05 06:24:06 compute-0 podman[211940]: 2025-12-05 06:24:06.489462442 +0000 UTC m=+0.073662090 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.openshift.expose-services=, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41)
Dec 05 06:24:06 compute-0 podman[211941]: 2025-12-05 06:24:06.492565407 +0000 UTC m=+0.074720010 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server [None req-0ba7e029-5e36-43bd-9ca1-9a7f763ba603 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Exception during message handling: nova.exception_Remote.InstanceNotFound_Remote: Instance 89d73880-ffbb-49c5-9e2d-a49a64c44523 could not be found.
Dec 05 06:24:07 compute-0 nova_compute[186329]: Traceback (most recent call last):
Dec 05 06:24:07 compute-0 nova_compute[186329]: 
Dec 05 06:24:07 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/conductor/manager.py", line 143, in _object_dispatch
Dec 05 06:24:07 compute-0 nova_compute[186329]:     return getattr(target, method)(*args, **kwargs)
Dec 05 06:24:07 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:24:07 compute-0 nova_compute[186329]: 
Dec 05 06:24:07 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_versionedobjects/base.py", line 226, in wrapper
Dec 05 06:24:07 compute-0 nova_compute[186329]:     return fn(self, *args, **kwargs)
Dec 05 06:24:07 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:24:07 compute-0 nova_compute[186329]: 
Dec 05 06:24:07 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/objects/instance.py", line 878, in save
Dec 05 06:24:07 compute-0 nova_compute[186329]:     old_ref, inst_ref = db.instance_update_and_get_original(
Dec 05 06:24:07 compute-0 nova_compute[186329]:                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:24:07 compute-0 nova_compute[186329]: 
Dec 05 06:24:07 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/db/utils.py", line 35, in wrapper
Dec 05 06:24:07 compute-0 nova_compute[186329]:     return f(*args, **kwargs)
Dec 05 06:24:07 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^
Dec 05 06:24:07 compute-0 nova_compute[186329]: 
Dec 05 06:24:07 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 144, in wrapper
Dec 05 06:24:07 compute-0 nova_compute[186329]:     with excutils.save_and_reraise_exception() as ectxt:
Dec 05 06:24:07 compute-0 nova_compute[186329]:          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:24:07 compute-0 nova_compute[186329]: 
Dec 05 06:24:07 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 05 06:24:07 compute-0 nova_compute[186329]:     self.force_reraise()
Dec 05 06:24:07 compute-0 nova_compute[186329]: 
Dec 05 06:24:07 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 05 06:24:07 compute-0 nova_compute[186329]:     raise self.value
Dec 05 06:24:07 compute-0 nova_compute[186329]: 
Dec 05 06:24:07 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 142, in wrapper
Dec 05 06:24:07 compute-0 nova_compute[186329]:     return f(*args, **kwargs)
Dec 05 06:24:07 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^
Dec 05 06:24:07 compute-0 nova_compute[186329]: 
Dec 05 06:24:07 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 207, in wrapper
Dec 05 06:24:07 compute-0 nova_compute[186329]:     return f(context, *args, **kwargs)
Dec 05 06:24:07 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:24:07 compute-0 nova_compute[186329]: 
Dec 05 06:24:07 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 2301, in instance_update_and_get_original
Dec 05 06:24:07 compute-0 nova_compute[186329]:     instance_ref = _instance_get_by_uuid(context, instance_uuid,
Dec 05 06:24:07 compute-0 nova_compute[186329]:                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:24:07 compute-0 nova_compute[186329]: 
Dec 05 06:24:07 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 1411, in _instance_get_by_uuid
Dec 05 06:24:07 compute-0 nova_compute[186329]:     raise exception.InstanceNotFound(instance_id=uuid)
Dec 05 06:24:07 compute-0 nova_compute[186329]: 
Dec 05 06:24:07 compute-0 nova_compute[186329]: nova.exception.InstanceNotFound: Instance 89d73880-ffbb-49c5-9e2d-a49a64c44523 could not be found.
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/server.py", line 174, in _process_incoming
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server              ^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/exception_wrapper.py", line 65, in wrapped
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception():
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server     self.force_reraise()
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server     raise self.value
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/exception_wrapper.py", line 63, in wrapped
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server     return f(self, context, *args, **kw)
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/compute/utils.py", line 1483, in decorated_function
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 203, in decorated_function
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 10237, in post_live_migration_at_destination
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server     instance.save(expected_task_state=task_states.MIGRATING)
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_versionedobjects/base.py", line 209, in wrapper
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server     updates, result = self.indirection_api.object_action(
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/conductor/rpcapi.py", line 247, in object_action
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server     return cctxt.call(context, 'object_action', objinst=objinst,
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/client.py", line 180, in call
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server     result = self.transport._send(
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server              ^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_messaging/transport.py", line 123, in _send
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server     return self._driver.send(target, ctxt, message,
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 794, in send
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server     return self._send(target, ctxt, message, wait_for_reply, timeout,
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 786, in _send
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server     raise result
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server nova.exception_Remote.InstanceNotFound_Remote: Instance 89d73880-ffbb-49c5-9e2d-a49a64c44523 could not be found.
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/conductor/manager.py", line 143, in _object_dispatch
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server     return getattr(target, method)(*args, **kwargs)
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_versionedobjects/base.py", line 226, in wrapper
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server     return fn(self, *args, **kwargs)
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/objects/instance.py", line 878, in save
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server     old_ref, inst_ref = db.instance_update_and_get_original(
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/db/utils.py", line 35, in wrapper
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server     return f(*args, **kwargs)
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 144, in wrapper
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception() as ectxt:
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server     self.force_reraise()
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server     raise self.value
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 142, in wrapper
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server     return f(*args, **kwargs)
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 207, in wrapper
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server     return f(context, *args, **kwargs)
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 2301, in instance_update_and_get_original
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server     instance_ref = _instance_get_by_uuid(context, instance_uuid,
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 1411, in _instance_get_by_uuid
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server     raise exception.InstanceNotFound(instance_id=uuid)
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server nova.exception.InstanceNotFound: Instance 89d73880-ffbb-49c5-9e2d-a49a64c44523 could not be found.
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:24:07 compute-0 nova_compute[186329]: 2025-12-05 06:24:07.991 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:24:08 compute-0 nova_compute[186329]: 2025-12-05 06:24:08.837 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:24:09 compute-0 nova_compute[186329]: 2025-12-05 06:24:09.288 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:24:13 compute-0 nova_compute[186329]: 2025-12-05 06:24:13.839 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:24:14 compute-0 nova_compute[186329]: 2025-12-05 06:24:14.290 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:24:15 compute-0 nova_compute[186329]: 2025-12-05 06:24:15.045 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:24:18 compute-0 nova_compute[186329]: 2025-12-05 06:24:18.841 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:24:19 compute-0 nova_compute[186329]: 2025-12-05 06:24:19.291 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:24:23 compute-0 nova_compute[186329]: 2025-12-05 06:24:23.842 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:24:24 compute-0 nova_compute[186329]: 2025-12-05 06:24:24.291 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:24:26 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:24:26.253 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:00:cb:f5 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-22a11164-8d14-45f7-8928-10d564f2f223', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-22a11164-8d14-45f7-8928-10d564f2f223', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '698ff04a95fc4ac6be1eb0e1dbe01302', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=359366c3-1c13-443d-9597-209e3af69d9c, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=5199b61b-18e1-4d88-bc76-49efa68df776) old=Port_Binding(mac=['fa:16:3e:00:cb:f5'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-22a11164-8d14-45f7-8928-10d564f2f223', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-22a11164-8d14-45f7-8928-10d564f2f223', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '698ff04a95fc4ac6be1eb0e1dbe01302', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:24:26 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:24:26.254 104041 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 5199b61b-18e1-4d88-bc76-49efa68df776 in datapath 22a11164-8d14-45f7-8928-10d564f2f223 updated
Dec 05 06:24:26 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:24:26.255 104041 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 22a11164-8d14-45f7-8928-10d564f2f223, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 05 06:24:26 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:24:26.255 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[c32f0e21-0ae2-4103-ab77-633bebd9c8c1]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:24:26 compute-0 podman[211993]: 2025-12-05 06:24:26.464605045 +0000 UTC m=+0.042655699 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 06:24:26 compute-0 podman[211992]: 2025-12-05 06:24:26.481443622 +0000 UTC m=+0.062108282 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 05 06:24:28 compute-0 nova_compute[186329]: 2025-12-05 06:24:28.844 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:24:29 compute-0 nova_compute[186329]: 2025-12-05 06:24:29.292 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:24:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:24:29.509 104041 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:24:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:24:29.509 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:24:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:24:29.509 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:24:29 compute-0 podman[196599]: time="2025-12-05T06:24:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:24:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:24:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:24:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:24:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2587 "" "Go-http-client/1.1"
Dec 05 06:24:31 compute-0 openstack_network_exporter[198686]: ERROR   06:24:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:24:31 compute-0 openstack_network_exporter[198686]: ERROR   06:24:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:24:31 compute-0 openstack_network_exporter[198686]: ERROR   06:24:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:24:31 compute-0 openstack_network_exporter[198686]: ERROR   06:24:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:24:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:24:31 compute-0 openstack_network_exporter[198686]: ERROR   06:24:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:24:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:24:33 compute-0 nova_compute[186329]: 2025-12-05 06:24:33.846 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:24:34 compute-0 nova_compute[186329]: 2025-12-05 06:24:34.294 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:24:37 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:24:37.380 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:39:43 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-7c48ac6a-33dd-4fff-b346-6e258f5a288a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7c48ac6a-33dd-4fff-b346-6e258f5a288a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4548dd99e0bd4ca59433132b59d02fcd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=43b2286f-adca-492e-965a-132e8cc9fc6c, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=98cf7373-8198-437f-ad5d-686357383993) old=Port_Binding(mac=['fa:16:3e:d5:39:43'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-7c48ac6a-33dd-4fff-b346-6e258f5a288a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7c48ac6a-33dd-4fff-b346-6e258f5a288a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4548dd99e0bd4ca59433132b59d02fcd', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:24:37 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:24:37.381 104041 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 98cf7373-8198-437f-ad5d-686357383993 in datapath 7c48ac6a-33dd-4fff-b346-6e258f5a288a updated
Dec 05 06:24:37 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:24:37.382 104041 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7c48ac6a-33dd-4fff-b346-6e258f5a288a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 05 06:24:37 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:24:37.382 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[68480f60-c37f-4949-a86d-68d1f2a419ad]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:24:37 compute-0 podman[212039]: 2025-12-05 06:24:37.468441291 +0000 UTC m=+0.048248137 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 05 06:24:37 compute-0 podman[212041]: 2025-12-05 06:24:37.491191646 +0000 UTC m=+0.055743653 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 05 06:24:37 compute-0 podman[212040]: 2025-12-05 06:24:37.502753349 +0000 UTC m=+0.078350657 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, vcs-type=git, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, architecture=x86_64, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container)
Dec 05 06:24:38 compute-0 nova_compute[186329]: 2025-12-05 06:24:38.848 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:24:39 compute-0 nova_compute[186329]: 2025-12-05 06:24:39.294 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:24:43 compute-0 nova_compute[186329]: 2025-12-05 06:24:43.849 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:24:44 compute-0 nova_compute[186329]: 2025-12-05 06:24:44.296 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:24:48 compute-0 nova_compute[186329]: 2025-12-05 06:24:48.850 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:24:49 compute-0 nova_compute[186329]: 2025-12-05 06:24:49.297 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:24:49 compute-0 ovn_controller[95223]: 2025-12-05T06:24:49Z|00146|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Dec 05 06:24:50 compute-0 nova_compute[186329]: 2025-12-05 06:24:50.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:24:52 compute-0 nova_compute[186329]: 2025-12-05 06:24:52.214 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:24:52 compute-0 nova_compute[186329]: 2025-12-05 06:24:52.705 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:24:53 compute-0 nova_compute[186329]: 2025-12-05 06:24:53.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:24:53 compute-0 nova_compute[186329]: 2025-12-05 06:24:53.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:24:53 compute-0 nova_compute[186329]: 2025-12-05 06:24:53.792 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:24:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:24:53.793 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '46:84:d1', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'a6:99:88:db:d9:a2'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:24:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:24:53.793 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 05 06:24:53 compute-0 nova_compute[186329]: 2025-12-05 06:24:53.852 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:24:54 compute-0 nova_compute[186329]: 2025-12-05 06:24:54.220 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:24:54 compute-0 nova_compute[186329]: 2025-12-05 06:24:54.220 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:24:54 compute-0 nova_compute[186329]: 2025-12-05 06:24:54.220 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:24:54 compute-0 nova_compute[186329]: 2025-12-05 06:24:54.220 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 05 06:24:54 compute-0 nova_compute[186329]: 2025-12-05 06:24:54.298 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:24:55 compute-0 nova_compute[186329]: 2025-12-05 06:24:55.246 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:24:55 compute-0 nova_compute[186329]: 2025-12-05 06:24:55.288 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:24:55 compute-0 nova_compute[186329]: 2025-12-05 06:24:55.288 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:24:55 compute-0 nova_compute[186329]: 2025-12-05 06:24:55.328 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk --force-share --output=json" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:24:55 compute-0 nova_compute[186329]: 2025-12-05 06:24:55.495 186333 WARNING nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:24:55 compute-0 nova_compute[186329]: 2025-12-05 06:24:55.496 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:24:55 compute-0 nova_compute[186329]: 2025-12-05 06:24:55.511 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.015s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:24:55 compute-0 nova_compute[186329]: 2025-12-05 06:24:55.512 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5746MB free_disk=73.13817977905273GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 05 06:24:55 compute-0 nova_compute[186329]: 2025-12-05 06:24:55.512 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:24:55 compute-0 nova_compute[186329]: 2025-12-05 06:24:55.512 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:24:57 compute-0 nova_compute[186329]: 2025-12-05 06:24:57.051 186333 INFO nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance 83dcc9e9-9fcf-4e34-8b76-598c38ed9bc0 has allocations against this compute host but is not found in the database.
Dec 05 06:24:57 compute-0 nova_compute[186329]: 2025-12-05 06:24:57.052 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 05 06:24:57 compute-0 nova_compute[186329]: 2025-12-05 06:24:57.052 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 06:24:55 up  1:02,  0 user,  load average: 0.24, 0.21, 0.27\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 05 06:24:57 compute-0 nova_compute[186329]: 2025-12-05 06:24:57.079 186333 DEBUG nova.compute.provider_tree [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:24:57 compute-0 podman[212102]: 2025-12-05 06:24:57.467393072 +0000 UTC m=+0.044309409 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 06:24:57 compute-0 podman[212101]: 2025-12-05 06:24:57.490573264 +0000 UTC m=+0.069899844 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec)
Dec 05 06:24:57 compute-0 nova_compute[186329]: 2025-12-05 06:24:57.584 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:24:58 compute-0 nova_compute[186329]: 2025-12-05 06:24:58.089 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 05 06:24:58 compute-0 nova_compute[186329]: 2025-12-05 06:24:58.090 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.577s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:24:58 compute-0 nova_compute[186329]: 2025-12-05 06:24:58.090 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:24:58 compute-0 nova_compute[186329]: 2025-12-05 06:24:58.090 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Dec 05 06:24:58 compute-0 nova_compute[186329]: 2025-12-05 06:24:58.855 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:24:59 compute-0 nova_compute[186329]: 2025-12-05 06:24:59.300 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:24:59 compute-0 nova_compute[186329]: 2025-12-05 06:24:59.593 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:24:59 compute-0 nova_compute[186329]: 2025-12-05 06:24:59.593 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:24:59 compute-0 nova_compute[186329]: 2025-12-05 06:24:59.593 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:24:59 compute-0 nova_compute[186329]: 2025-12-05 06:24:59.593 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:24:59 compute-0 podman[196599]: time="2025-12-05T06:24:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:24:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:24:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:24:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:24:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2587 "" "Go-http-client/1.1"
Dec 05 06:25:00 compute-0 nova_compute[186329]: 2025-12-05 06:25:00.106 186333 WARNING nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] While synchronizing instance power states, found 0 instances in the database and 1 instances on the hypervisor.
Dec 05 06:25:00 compute-0 nova_compute[186329]: 2025-12-05 06:25:00.107 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:25:00 compute-0 nova_compute[186329]: 2025-12-05 06:25:00.107 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 05 06:25:01 compute-0 nova_compute[186329]: 2025-12-05 06:25:01.224 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:25:01 compute-0 openstack_network_exporter[198686]: ERROR   06:25:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:25:01 compute-0 openstack_network_exporter[198686]: ERROR   06:25:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:25:01 compute-0 openstack_network_exporter[198686]: ERROR   06:25:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:25:01 compute-0 openstack_network_exporter[198686]: ERROR   06:25:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:25:01 compute-0 openstack_network_exporter[198686]: ERROR   06:25:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:25:03 compute-0 nova_compute[186329]: 2025-12-05 06:25:03.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:25:03 compute-0 nova_compute[186329]: 2025-12-05 06:25:03.710 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Dec 05 06:25:03 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:03.794 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=89d40815-76f5-4f1d-9077-84d831b7d6c4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:25:03 compute-0 nova_compute[186329]: 2025-12-05 06:25:03.857 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:04 compute-0 nova_compute[186329]: 2025-12-05 06:25:04.215 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Dec 05 06:25:04 compute-0 nova_compute[186329]: 2025-12-05 06:25:04.300 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:08 compute-0 podman[212148]: 2025-12-05 06:25:08.475501662 +0000 UTC m=+0.055438268 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, config_id=multipathd)
Dec 05 06:25:08 compute-0 podman[212146]: 2025-12-05 06:25:08.488463909 +0000 UTC m=+0.073591978 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 05 06:25:08 compute-0 podman[212147]: 2025-12-05 06:25:08.495681132 +0000 UTC m=+0.079363413 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, architecture=x86_64, config_id=edpm, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 05 06:25:08 compute-0 nova_compute[186329]: 2025-12-05 06:25:08.858 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:09 compute-0 nova_compute[186329]: 2025-12-05 06:25:09.301 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:09 compute-0 nova_compute[186329]: 2025-12-05 06:25:09.454 186333 DEBUG oslo_concurrency.lockutils [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Acquiring lock "0e7a5bec-7f70-404a-bb58-d6b499c04bae" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:25:09 compute-0 nova_compute[186329]: 2025-12-05 06:25:09.454 186333 DEBUG oslo_concurrency.lockutils [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Lock "0e7a5bec-7f70-404a-bb58-d6b499c04bae" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:25:09 compute-0 nova_compute[186329]: 2025-12-05 06:25:09.957 186333 DEBUG nova.compute.manager [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Dec 05 06:25:10 compute-0 nova_compute[186329]: 2025-12-05 06:25:10.488 186333 DEBUG oslo_concurrency.lockutils [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:25:10 compute-0 nova_compute[186329]: 2025-12-05 06:25:10.488 186333 DEBUG oslo_concurrency.lockutils [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:25:10 compute-0 nova_compute[186329]: 2025-12-05 06:25:10.492 186333 DEBUG nova.virt.hardware [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 05 06:25:10 compute-0 nova_compute[186329]: 2025-12-05 06:25:10.492 186333 INFO nova.compute.claims [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Claim successful on node compute-0.ctlplane.example.com
Dec 05 06:25:11 compute-0 nova_compute[186329]: 2025-12-05 06:25:11.540 186333 DEBUG nova.compute.provider_tree [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:25:12 compute-0 nova_compute[186329]: 2025-12-05 06:25:12.044 186333 DEBUG nova.scheduler.client.report [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:25:12 compute-0 nova_compute[186329]: 2025-12-05 06:25:12.550 186333 DEBUG oslo_concurrency.lockutils [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.062s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:25:12 compute-0 nova_compute[186329]: 2025-12-05 06:25:12.551 186333 DEBUG nova.compute.manager [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Dec 05 06:25:13 compute-0 nova_compute[186329]: 2025-12-05 06:25:13.057 186333 DEBUG nova.compute.manager [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Dec 05 06:25:13 compute-0 nova_compute[186329]: 2025-12-05 06:25:13.057 186333 DEBUG nova.network.neutron [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Dec 05 06:25:13 compute-0 nova_compute[186329]: 2025-12-05 06:25:13.058 186333 WARNING neutronclient.v2_0.client [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:25:13 compute-0 nova_compute[186329]: 2025-12-05 06:25:13.058 186333 WARNING neutronclient.v2_0.client [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:25:13 compute-0 nova_compute[186329]: 2025-12-05 06:25:13.562 186333 INFO nova.virt.libvirt.driver [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 06:25:13 compute-0 nova_compute[186329]: 2025-12-05 06:25:13.860 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:13 compute-0 nova_compute[186329]: 2025-12-05 06:25:13.866 186333 DEBUG nova.network.neutron [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Successfully created port: 22b7cda4-bb59-4fdf-812a-a071b6a4d13f _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Dec 05 06:25:14 compute-0 nova_compute[186329]: 2025-12-05 06:25:14.069 186333 DEBUG nova.compute.manager [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Dec 05 06:25:14 compute-0 nova_compute[186329]: 2025-12-05 06:25:14.303 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:14 compute-0 nova_compute[186329]: 2025-12-05 06:25:14.695 186333 DEBUG nova.network.neutron [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Successfully updated port: 22b7cda4-bb59-4fdf-812a-a071b6a4d13f _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Dec 05 06:25:14 compute-0 nova_compute[186329]: 2025-12-05 06:25:14.736 186333 DEBUG nova.compute.manager [req-36e0a740-8805-4879-af9f-3519592bb558 req-9aedeab6-392d-412e-8183-77f193e874a1 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Received event network-changed-22b7cda4-bb59-4fdf-812a-a071b6a4d13f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:25:14 compute-0 nova_compute[186329]: 2025-12-05 06:25:14.736 186333 DEBUG nova.compute.manager [req-36e0a740-8805-4879-af9f-3519592bb558 req-9aedeab6-392d-412e-8183-77f193e874a1 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Refreshing instance network info cache due to event network-changed-22b7cda4-bb59-4fdf-812a-a071b6a4d13f. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 05 06:25:14 compute-0 nova_compute[186329]: 2025-12-05 06:25:14.736 186333 DEBUG oslo_concurrency.lockutils [req-36e0a740-8805-4879-af9f-3519592bb558 req-9aedeab6-392d-412e-8183-77f193e874a1 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "refresh_cache-0e7a5bec-7f70-404a-bb58-d6b499c04bae" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:25:14 compute-0 nova_compute[186329]: 2025-12-05 06:25:14.736 186333 DEBUG oslo_concurrency.lockutils [req-36e0a740-8805-4879-af9f-3519592bb558 req-9aedeab6-392d-412e-8183-77f193e874a1 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquired lock "refresh_cache-0e7a5bec-7f70-404a-bb58-d6b499c04bae" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:25:14 compute-0 nova_compute[186329]: 2025-12-05 06:25:14.736 186333 DEBUG nova.network.neutron [req-36e0a740-8805-4879-af9f-3519592bb558 req-9aedeab6-392d-412e-8183-77f193e874a1 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Refreshing network info cache for port 22b7cda4-bb59-4fdf-812a-a071b6a4d13f _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 05 06:25:15 compute-0 nova_compute[186329]: 2025-12-05 06:25:15.079 186333 DEBUG nova.compute.manager [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Dec 05 06:25:15 compute-0 nova_compute[186329]: 2025-12-05 06:25:15.080 186333 DEBUG nova.virt.libvirt.driver [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Dec 05 06:25:15 compute-0 nova_compute[186329]: 2025-12-05 06:25:15.083 186333 INFO nova.virt.libvirt.driver [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Creating image(s)
Dec 05 06:25:15 compute-0 nova_compute[186329]: 2025-12-05 06:25:15.084 186333 DEBUG oslo_concurrency.lockutils [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Acquiring lock "/var/lib/nova/instances/0e7a5bec-7f70-404a-bb58-d6b499c04bae/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:25:15 compute-0 nova_compute[186329]: 2025-12-05 06:25:15.084 186333 DEBUG oslo_concurrency.lockutils [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Lock "/var/lib/nova/instances/0e7a5bec-7f70-404a-bb58-d6b499c04bae/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:25:15 compute-0 nova_compute[186329]: 2025-12-05 06:25:15.085 186333 DEBUG oslo_concurrency.lockutils [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Lock "/var/lib/nova/instances/0e7a5bec-7f70-404a-bb58-d6b499c04bae/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:25:15 compute-0 nova_compute[186329]: 2025-12-05 06:25:15.085 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:25:15 compute-0 nova_compute[186329]: 2025-12-05 06:25:15.087 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:25:15 compute-0 nova_compute[186329]: 2025-12-05 06:25:15.089 186333 DEBUG oslo_concurrency.processutils [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:25:15 compute-0 nova_compute[186329]: 2025-12-05 06:25:15.130 186333 DEBUG oslo_concurrency.processutils [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:25:15 compute-0 nova_compute[186329]: 2025-12-05 06:25:15.131 186333 DEBUG oslo_concurrency.lockutils [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Acquiring lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:25:15 compute-0 nova_compute[186329]: 2025-12-05 06:25:15.131 186333 DEBUG oslo_concurrency.lockutils [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:25:15 compute-0 nova_compute[186329]: 2025-12-05 06:25:15.131 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:25:15 compute-0 nova_compute[186329]: 2025-12-05 06:25:15.134 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:25:15 compute-0 nova_compute[186329]: 2025-12-05 06:25:15.134 186333 DEBUG oslo_concurrency.processutils [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:25:15 compute-0 nova_compute[186329]: 2025-12-05 06:25:15.173 186333 DEBUG oslo_concurrency.processutils [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:25:15 compute-0 nova_compute[186329]: 2025-12-05 06:25:15.174 186333 DEBUG oslo_concurrency.processutils [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b,backing_fmt=raw /var/lib/nova/instances/0e7a5bec-7f70-404a-bb58-d6b499c04bae/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:25:15 compute-0 nova_compute[186329]: 2025-12-05 06:25:15.192 186333 DEBUG oslo_concurrency.processutils [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b,backing_fmt=raw /var/lib/nova/instances/0e7a5bec-7f70-404a-bb58-d6b499c04bae/disk 1073741824" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:25:15 compute-0 nova_compute[186329]: 2025-12-05 06:25:15.192 186333 DEBUG oslo_concurrency.lockutils [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.061s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:25:15 compute-0 nova_compute[186329]: 2025-12-05 06:25:15.193 186333 DEBUG oslo_concurrency.processutils [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:25:15 compute-0 nova_compute[186329]: 2025-12-05 06:25:15.200 186333 DEBUG oslo_concurrency.lockutils [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Acquiring lock "refresh_cache-0e7a5bec-7f70-404a-bb58-d6b499c04bae" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:25:15 compute-0 nova_compute[186329]: 2025-12-05 06:25:15.234 186333 DEBUG oslo_concurrency.processutils [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:25:15 compute-0 nova_compute[186329]: 2025-12-05 06:25:15.235 186333 DEBUG nova.virt.disk.api [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Checking if we can resize image /var/lib/nova/instances/0e7a5bec-7f70-404a-bb58-d6b499c04bae/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 05 06:25:15 compute-0 nova_compute[186329]: 2025-12-05 06:25:15.235 186333 DEBUG oslo_concurrency.processutils [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0e7a5bec-7f70-404a-bb58-d6b499c04bae/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:25:15 compute-0 nova_compute[186329]: 2025-12-05 06:25:15.241 186333 WARNING neutronclient.v2_0.client [req-36e0a740-8805-4879-af9f-3519592bb558 req-9aedeab6-392d-412e-8183-77f193e874a1 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:25:15 compute-0 nova_compute[186329]: 2025-12-05 06:25:15.277 186333 DEBUG oslo_concurrency.processutils [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0e7a5bec-7f70-404a-bb58-d6b499c04bae/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:25:15 compute-0 nova_compute[186329]: 2025-12-05 06:25:15.277 186333 DEBUG nova.virt.disk.api [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Cannot resize image /var/lib/nova/instances/0e7a5bec-7f70-404a-bb58-d6b499c04bae/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 05 06:25:15 compute-0 nova_compute[186329]: 2025-12-05 06:25:15.278 186333 DEBUG nova.virt.libvirt.driver [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Dec 05 06:25:15 compute-0 nova_compute[186329]: 2025-12-05 06:25:15.278 186333 DEBUG nova.virt.libvirt.driver [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Ensure instance console log exists: /var/lib/nova/instances/0e7a5bec-7f70-404a-bb58-d6b499c04bae/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 05 06:25:15 compute-0 nova_compute[186329]: 2025-12-05 06:25:15.278 186333 DEBUG oslo_concurrency.lockutils [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:25:15 compute-0 nova_compute[186329]: 2025-12-05 06:25:15.279 186333 DEBUG oslo_concurrency.lockutils [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:25:15 compute-0 nova_compute[186329]: 2025-12-05 06:25:15.279 186333 DEBUG oslo_concurrency.lockutils [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:25:15 compute-0 nova_compute[186329]: 2025-12-05 06:25:15.557 186333 DEBUG nova.network.neutron [req-36e0a740-8805-4879-af9f-3519592bb558 req-9aedeab6-392d-412e-8183-77f193e874a1 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 05 06:25:15 compute-0 nova_compute[186329]: 2025-12-05 06:25:15.656 186333 DEBUG nova.network.neutron [req-36e0a740-8805-4879-af9f-3519592bb558 req-9aedeab6-392d-412e-8183-77f193e874a1 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:25:16 compute-0 nova_compute[186329]: 2025-12-05 06:25:16.161 186333 DEBUG oslo_concurrency.lockutils [req-36e0a740-8805-4879-af9f-3519592bb558 req-9aedeab6-392d-412e-8183-77f193e874a1 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Releasing lock "refresh_cache-0e7a5bec-7f70-404a-bb58-d6b499c04bae" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:25:16 compute-0 nova_compute[186329]: 2025-12-05 06:25:16.161 186333 DEBUG oslo_concurrency.lockutils [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Acquired lock "refresh_cache-0e7a5bec-7f70-404a-bb58-d6b499c04bae" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:25:16 compute-0 nova_compute[186329]: 2025-12-05 06:25:16.162 186333 DEBUG nova.network.neutron [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 05 06:25:16 compute-0 nova_compute[186329]: 2025-12-05 06:25:16.736 186333 DEBUG nova.network.neutron [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 05 06:25:16 compute-0 nova_compute[186329]: 2025-12-05 06:25:16.873 186333 WARNING neutronclient.v2_0.client [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:25:17 compute-0 nova_compute[186329]: 2025-12-05 06:25:17.006 186333 DEBUG nova.network.neutron [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Updating instance_info_cache with network_info: [{"id": "22b7cda4-bb59-4fdf-812a-a071b6a4d13f", "address": "fa:16:3e:49:64:e1", "network": {"id": "22a11164-8d14-45f7-8928-10d564f2f223", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-651760568-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "698ff04a95fc4ac6be1eb0e1dbe01302", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22b7cda4-bb", "ovs_interfaceid": "22b7cda4-bb59-4fdf-812a-a071b6a4d13f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:25:17 compute-0 nova_compute[186329]: 2025-12-05 06:25:17.511 186333 DEBUG oslo_concurrency.lockutils [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Releasing lock "refresh_cache-0e7a5bec-7f70-404a-bb58-d6b499c04bae" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:25:17 compute-0 nova_compute[186329]: 2025-12-05 06:25:17.511 186333 DEBUG nova.compute.manager [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Instance network_info: |[{"id": "22b7cda4-bb59-4fdf-812a-a071b6a4d13f", "address": "fa:16:3e:49:64:e1", "network": {"id": "22a11164-8d14-45f7-8928-10d564f2f223", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-651760568-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "698ff04a95fc4ac6be1eb0e1dbe01302", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22b7cda4-bb", "ovs_interfaceid": "22b7cda4-bb59-4fdf-812a-a071b6a4d13f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Dec 05 06:25:17 compute-0 nova_compute[186329]: 2025-12-05 06:25:17.513 186333 DEBUG nova.virt.libvirt.driver [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Start _get_guest_xml network_info=[{"id": "22b7cda4-bb59-4fdf-812a-a071b6a4d13f", "address": "fa:16:3e:49:64:e1", "network": {"id": "22a11164-8d14-45f7-8928-10d564f2f223", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-651760568-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "698ff04a95fc4ac6be1eb0e1dbe01302", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22b7cda4-bb", "ovs_interfaceid": "22b7cda4-bb59-4fdf-812a-a071b6a4d13f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T06:07:47Z,direct_url=<?>,disk_format='qcow2',id=6903ca06-7f44-4ad2-ab8b-0d16feef7d51,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fcef582be2274b9ba43451b49b4066ec',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T06:07:49Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'device_name': '/dev/vda', 'guest_format': None, 'image_id': '6903ca06-7f44-4ad2-ab8b-0d16feef7d51'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 05 06:25:17 compute-0 nova_compute[186329]: 2025-12-05 06:25:17.516 186333 WARNING nova.virt.libvirt.driver [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:25:17 compute-0 nova_compute[186329]: 2025-12-05 06:25:17.517 186333 DEBUG nova.virt.driver [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='6903ca06-7f44-4ad2-ab8b-0d16feef7d51', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1820134413', uuid='0e7a5bec-7f70-404a-bb58-d6b499c04bae'), owner=OwnerMeta(userid='72f24e9b0fa74da299d3bfff79a1fd92', username='tempest-TestExecuteVmWorkloadBalanceStrategy-249593407-project-admin', projectid='4548dd99e0bd4ca59433132b59d02fcd', projectname='tempest-TestExecuteVmWorkloadBalanceStrategy-249593407'), image=ImageMeta(id='6903ca06-7f44-4ad2-ab8b-0d16feef7d51', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='cb13e320-971c-46c2-a935-d695f3631bf8', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "22b7cda4-bb59-4fdf-812a-a071b6a4d13f", "address": "fa:16:3e:49:64:e1", "network": {"id": "22a11164-8d14-45f7-8928-10d564f2f223", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-651760568-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "698ff04a95fc4ac6be1eb0e1dbe01302", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22b7cda4-bb", "ovs_interfaceid": "22b7cda4-bb59-4fdf-812a-a071b6a4d13f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764915917.517364) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 05 06:25:17 compute-0 nova_compute[186329]: 2025-12-05 06:25:17.520 186333 DEBUG nova.virt.libvirt.host [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 05 06:25:17 compute-0 nova_compute[186329]: 2025-12-05 06:25:17.521 186333 DEBUG nova.virt.libvirt.host [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 05 06:25:17 compute-0 nova_compute[186329]: 2025-12-05 06:25:17.523 186333 DEBUG nova.virt.libvirt.host [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 05 06:25:17 compute-0 nova_compute[186329]: 2025-12-05 06:25:17.523 186333 DEBUG nova.virt.libvirt.host [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 05 06:25:17 compute-0 nova_compute[186329]: 2025-12-05 06:25:17.524 186333 DEBUG nova.virt.libvirt.driver [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 05 06:25:17 compute-0 nova_compute[186329]: 2025-12-05 06:25:17.524 186333 DEBUG nova.virt.hardware [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T06:07:46Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cb13e320-971c-46c2-a935-d695f3631bf8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T06:07:47Z,direct_url=<?>,disk_format='qcow2',id=6903ca06-7f44-4ad2-ab8b-0d16feef7d51,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fcef582be2274b9ba43451b49b4066ec',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T06:07:49Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 05 06:25:17 compute-0 nova_compute[186329]: 2025-12-05 06:25:17.524 186333 DEBUG nova.virt.hardware [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 05 06:25:17 compute-0 nova_compute[186329]: 2025-12-05 06:25:17.525 186333 DEBUG nova.virt.hardware [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 05 06:25:17 compute-0 nova_compute[186329]: 2025-12-05 06:25:17.525 186333 DEBUG nova.virt.hardware [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 05 06:25:17 compute-0 nova_compute[186329]: 2025-12-05 06:25:17.525 186333 DEBUG nova.virt.hardware [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 05 06:25:17 compute-0 nova_compute[186329]: 2025-12-05 06:25:17.525 186333 DEBUG nova.virt.hardware [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 05 06:25:17 compute-0 nova_compute[186329]: 2025-12-05 06:25:17.525 186333 DEBUG nova.virt.hardware [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 05 06:25:17 compute-0 nova_compute[186329]: 2025-12-05 06:25:17.526 186333 DEBUG nova.virt.hardware [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 05 06:25:17 compute-0 nova_compute[186329]: 2025-12-05 06:25:17.526 186333 DEBUG nova.virt.hardware [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 05 06:25:17 compute-0 nova_compute[186329]: 2025-12-05 06:25:17.526 186333 DEBUG nova.virt.hardware [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 05 06:25:17 compute-0 nova_compute[186329]: 2025-12-05 06:25:17.526 186333 DEBUG nova.virt.hardware [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 05 06:25:17 compute-0 nova_compute[186329]: 2025-12-05 06:25:17.529 186333 DEBUG nova.virt.libvirt.vif [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-05T06:25:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1820134413',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1820134413',id=17,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4548dd99e0bd4ca59433132b59d02fcd',ramdisk_id='',reservation_id='r-lhdqy2m7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,member,admin,reader',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-249593407',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-249593407-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T06:25:14Z,user_data=None,user_id='72f24e9b0fa74da299d3bfff79a1fd92',uuid=0e7a5bec-7f70-404a-bb58-d6b499c04bae,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "22b7cda4-bb59-4fdf-812a-a071b6a4d13f", "address": "fa:16:3e:49:64:e1", "network": {"id": "22a11164-8d14-45f7-8928-10d564f2f223", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-651760568-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "698ff04a95fc4ac6be1eb0e1dbe01302", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22b7cda4-bb", "ovs_interfaceid": "22b7cda4-bb59-4fdf-812a-a071b6a4d13f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 05 06:25:17 compute-0 nova_compute[186329]: 2025-12-05 06:25:17.529 186333 DEBUG nova.network.os_vif_util [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Converting VIF {"id": "22b7cda4-bb59-4fdf-812a-a071b6a4d13f", "address": "fa:16:3e:49:64:e1", "network": {"id": "22a11164-8d14-45f7-8928-10d564f2f223", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-651760568-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "698ff04a95fc4ac6be1eb0e1dbe01302", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22b7cda4-bb", "ovs_interfaceid": "22b7cda4-bb59-4fdf-812a-a071b6a4d13f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:25:17 compute-0 nova_compute[186329]: 2025-12-05 06:25:17.530 186333 DEBUG nova.network.os_vif_util [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:49:64:e1,bridge_name='br-int',has_traffic_filtering=True,id=22b7cda4-bb59-4fdf-812a-a071b6a4d13f,network=Network(22a11164-8d14-45f7-8928-10d564f2f223),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap22b7cda4-bb') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:25:17 compute-0 nova_compute[186329]: 2025-12-05 06:25:17.530 186333 DEBUG nova.objects.instance [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Lazy-loading 'pci_devices' on Instance uuid 0e7a5bec-7f70-404a-bb58-d6b499c04bae obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:25:18 compute-0 nova_compute[186329]: 2025-12-05 06:25:18.035 186333 DEBUG nova.virt.libvirt.driver [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] End _get_guest_xml xml=<domain type="kvm">
Dec 05 06:25:18 compute-0 nova_compute[186329]:   <uuid>0e7a5bec-7f70-404a-bb58-d6b499c04bae</uuid>
Dec 05 06:25:18 compute-0 nova_compute[186329]:   <name>instance-00000011</name>
Dec 05 06:25:18 compute-0 nova_compute[186329]:   <memory>131072</memory>
Dec 05 06:25:18 compute-0 nova_compute[186329]:   <vcpu>1</vcpu>
Dec 05 06:25:18 compute-0 nova_compute[186329]:   <metadata>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 06:25:18 compute-0 nova_compute[186329]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:       <nova:name>tempest-TestExecuteVmWorkloadBalanceStrategy-server-1820134413</nova:name>
Dec 05 06:25:18 compute-0 nova_compute[186329]:       <nova:creationTime>2025-12-05 06:25:17</nova:creationTime>
Dec 05 06:25:18 compute-0 nova_compute[186329]:       <nova:flavor name="m1.nano" id="cb13e320-971c-46c2-a935-d695f3631bf8">
Dec 05 06:25:18 compute-0 nova_compute[186329]:         <nova:memory>128</nova:memory>
Dec 05 06:25:18 compute-0 nova_compute[186329]:         <nova:disk>1</nova:disk>
Dec 05 06:25:18 compute-0 nova_compute[186329]:         <nova:swap>0</nova:swap>
Dec 05 06:25:18 compute-0 nova_compute[186329]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 06:25:18 compute-0 nova_compute[186329]:         <nova:vcpus>1</nova:vcpus>
Dec 05 06:25:18 compute-0 nova_compute[186329]:         <nova:extraSpecs>
Dec 05 06:25:18 compute-0 nova_compute[186329]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 05 06:25:18 compute-0 nova_compute[186329]:         </nova:extraSpecs>
Dec 05 06:25:18 compute-0 nova_compute[186329]:       </nova:flavor>
Dec 05 06:25:18 compute-0 nova_compute[186329]:       <nova:image uuid="6903ca06-7f44-4ad2-ab8b-0d16feef7d51">
Dec 05 06:25:18 compute-0 nova_compute[186329]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 05 06:25:18 compute-0 nova_compute[186329]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 05 06:25:18 compute-0 nova_compute[186329]:         <nova:minDisk>1</nova:minDisk>
Dec 05 06:25:18 compute-0 nova_compute[186329]:         <nova:minRam>0</nova:minRam>
Dec 05 06:25:18 compute-0 nova_compute[186329]:         <nova:properties>
Dec 05 06:25:18 compute-0 nova_compute[186329]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 05 06:25:18 compute-0 nova_compute[186329]:         </nova:properties>
Dec 05 06:25:18 compute-0 nova_compute[186329]:       </nova:image>
Dec 05 06:25:18 compute-0 nova_compute[186329]:       <nova:owner>
Dec 05 06:25:18 compute-0 nova_compute[186329]:         <nova:user uuid="72f24e9b0fa74da299d3bfff79a1fd92">tempest-TestExecuteVmWorkloadBalanceStrategy-249593407-project-admin</nova:user>
Dec 05 06:25:18 compute-0 nova_compute[186329]:         <nova:project uuid="4548dd99e0bd4ca59433132b59d02fcd">tempest-TestExecuteVmWorkloadBalanceStrategy-249593407</nova:project>
Dec 05 06:25:18 compute-0 nova_compute[186329]:       </nova:owner>
Dec 05 06:25:18 compute-0 nova_compute[186329]:       <nova:root type="image" uuid="6903ca06-7f44-4ad2-ab8b-0d16feef7d51"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:       <nova:ports>
Dec 05 06:25:18 compute-0 nova_compute[186329]:         <nova:port uuid="22b7cda4-bb59-4fdf-812a-a071b6a4d13f">
Dec 05 06:25:18 compute-0 nova_compute[186329]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:         </nova:port>
Dec 05 06:25:18 compute-0 nova_compute[186329]:       </nova:ports>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     </nova:instance>
Dec 05 06:25:18 compute-0 nova_compute[186329]:   </metadata>
Dec 05 06:25:18 compute-0 nova_compute[186329]:   <sysinfo type="smbios">
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <system>
Dec 05 06:25:18 compute-0 nova_compute[186329]:       <entry name="manufacturer">RDO</entry>
Dec 05 06:25:18 compute-0 nova_compute[186329]:       <entry name="product">OpenStack Compute</entry>
Dec 05 06:25:18 compute-0 nova_compute[186329]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 05 06:25:18 compute-0 nova_compute[186329]:       <entry name="serial">0e7a5bec-7f70-404a-bb58-d6b499c04bae</entry>
Dec 05 06:25:18 compute-0 nova_compute[186329]:       <entry name="uuid">0e7a5bec-7f70-404a-bb58-d6b499c04bae</entry>
Dec 05 06:25:18 compute-0 nova_compute[186329]:       <entry name="family">Virtual Machine</entry>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     </system>
Dec 05 06:25:18 compute-0 nova_compute[186329]:   </sysinfo>
Dec 05 06:25:18 compute-0 nova_compute[186329]:   <os>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <boot dev="hd"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <smbios mode="sysinfo"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:   </os>
Dec 05 06:25:18 compute-0 nova_compute[186329]:   <features>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <acpi/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <apic/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <vmcoreinfo/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:   </features>
Dec 05 06:25:18 compute-0 nova_compute[186329]:   <clock offset="utc">
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <timer name="hpet" present="no"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:   </clock>
Dec 05 06:25:18 compute-0 nova_compute[186329]:   <cpu mode="custom" match="exact">
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <model>Nehalem</model>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:   </cpu>
Dec 05 06:25:18 compute-0 nova_compute[186329]:   <devices>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <disk type="file" device="disk">
Dec 05 06:25:18 compute-0 nova_compute[186329]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:       <source file="/var/lib/nova/instances/0e7a5bec-7f70-404a-bb58-d6b499c04bae/disk"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:       <target dev="vda" bus="virtio"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     </disk>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <disk type="file" device="cdrom">
Dec 05 06:25:18 compute-0 nova_compute[186329]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:       <source file="/var/lib/nova/instances/0e7a5bec-7f70-404a-bb58-d6b499c04bae/disk.config"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:       <target dev="sda" bus="sata"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     </disk>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <interface type="ethernet">
Dec 05 06:25:18 compute-0 nova_compute[186329]:       <mac address="fa:16:3e:49:64:e1"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:       <model type="virtio"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:       <mtu size="1442"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:       <target dev="tap22b7cda4-bb"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     </interface>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <serial type="pty">
Dec 05 06:25:18 compute-0 nova_compute[186329]:       <log file="/var/lib/nova/instances/0e7a5bec-7f70-404a-bb58-d6b499c04bae/console.log" append="off"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     </serial>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <video>
Dec 05 06:25:18 compute-0 nova_compute[186329]:       <model type="virtio"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     </video>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <input type="tablet" bus="usb"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <rng model="virtio">
Dec 05 06:25:18 compute-0 nova_compute[186329]:       <backend model="random">/dev/urandom</backend>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     </rng>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <controller type="usb" index="0"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 05 06:25:18 compute-0 nova_compute[186329]:       <stats period="10"/>
Dec 05 06:25:18 compute-0 nova_compute[186329]:     </memballoon>
Dec 05 06:25:18 compute-0 nova_compute[186329]:   </devices>
Dec 05 06:25:18 compute-0 nova_compute[186329]: </domain>
Dec 05 06:25:18 compute-0 nova_compute[186329]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 05 06:25:18 compute-0 nova_compute[186329]: 2025-12-05 06:25:18.036 186333 DEBUG nova.compute.manager [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Preparing to wait for external event network-vif-plugged-22b7cda4-bb59-4fdf-812a-a071b6a4d13f prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 05 06:25:18 compute-0 nova_compute[186329]: 2025-12-05 06:25:18.036 186333 DEBUG oslo_concurrency.lockutils [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Acquiring lock "0e7a5bec-7f70-404a-bb58-d6b499c04bae-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:25:18 compute-0 nova_compute[186329]: 2025-12-05 06:25:18.036 186333 DEBUG oslo_concurrency.lockutils [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Lock "0e7a5bec-7f70-404a-bb58-d6b499c04bae-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:25:18 compute-0 nova_compute[186329]: 2025-12-05 06:25:18.037 186333 DEBUG oslo_concurrency.lockutils [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Lock "0e7a5bec-7f70-404a-bb58-d6b499c04bae-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:25:18 compute-0 nova_compute[186329]: 2025-12-05 06:25:18.037 186333 DEBUG nova.virt.libvirt.vif [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-05T06:25:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1820134413',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1820134413',id=17,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4548dd99e0bd4ca59433132b59d02fcd',ramdisk_id='',reservation_id='r-lhdqy2m7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,member,admin,reader',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-249593407',owner_user_
name='tempest-TestExecuteVmWorkloadBalanceStrategy-249593407-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T06:25:14Z,user_data=None,user_id='72f24e9b0fa74da299d3bfff79a1fd92',uuid=0e7a5bec-7f70-404a-bb58-d6b499c04bae,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "22b7cda4-bb59-4fdf-812a-a071b6a4d13f", "address": "fa:16:3e:49:64:e1", "network": {"id": "22a11164-8d14-45f7-8928-10d564f2f223", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-651760568-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "698ff04a95fc4ac6be1eb0e1dbe01302", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22b7cda4-bb", "ovs_interfaceid": "22b7cda4-bb59-4fdf-812a-a071b6a4d13f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 05 06:25:18 compute-0 nova_compute[186329]: 2025-12-05 06:25:18.037 186333 DEBUG nova.network.os_vif_util [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Converting VIF {"id": "22b7cda4-bb59-4fdf-812a-a071b6a4d13f", "address": "fa:16:3e:49:64:e1", "network": {"id": "22a11164-8d14-45f7-8928-10d564f2f223", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-651760568-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "698ff04a95fc4ac6be1eb0e1dbe01302", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22b7cda4-bb", "ovs_interfaceid": "22b7cda4-bb59-4fdf-812a-a071b6a4d13f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:25:18 compute-0 nova_compute[186329]: 2025-12-05 06:25:18.038 186333 DEBUG nova.network.os_vif_util [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:49:64:e1,bridge_name='br-int',has_traffic_filtering=True,id=22b7cda4-bb59-4fdf-812a-a071b6a4d13f,network=Network(22a11164-8d14-45f7-8928-10d564f2f223),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap22b7cda4-bb') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:25:18 compute-0 nova_compute[186329]: 2025-12-05 06:25:18.038 186333 DEBUG os_vif [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:49:64:e1,bridge_name='br-int',has_traffic_filtering=True,id=22b7cda4-bb59-4fdf-812a-a071b6a4d13f,network=Network(22a11164-8d14-45f7-8928-10d564f2f223),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap22b7cda4-bb') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 05 06:25:18 compute-0 nova_compute[186329]: 2025-12-05 06:25:18.038 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:18 compute-0 nova_compute[186329]: 2025-12-05 06:25:18.039 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:25:18 compute-0 nova_compute[186329]: 2025-12-05 06:25:18.039 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:25:18 compute-0 nova_compute[186329]: 2025-12-05 06:25:18.040 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:18 compute-0 nova_compute[186329]: 2025-12-05 06:25:18.040 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'e84647c3-1e3a-5a73-b031-145e58f26d61', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:25:18 compute-0 nova_compute[186329]: 2025-12-05 06:25:18.040 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:18 compute-0 nova_compute[186329]: 2025-12-05 06:25:18.042 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:18 compute-0 nova_compute[186329]: 2025-12-05 06:25:18.045 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:18 compute-0 nova_compute[186329]: 2025-12-05 06:25:18.045 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap22b7cda4-bb, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:25:18 compute-0 nova_compute[186329]: 2025-12-05 06:25:18.045 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap22b7cda4-bb, col_values=(('qos', UUID('e6af741a-e8c4-4d67-bf82-4b0edbbf14e7')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:25:18 compute-0 nova_compute[186329]: 2025-12-05 06:25:18.046 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap22b7cda4-bb, col_values=(('external_ids', {'iface-id': '22b7cda4-bb59-4fdf-812a-a071b6a4d13f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:49:64:e1', 'vm-uuid': '0e7a5bec-7f70-404a-bb58-d6b499c04bae'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:25:18 compute-0 nova_compute[186329]: 2025-12-05 06:25:18.046 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:18 compute-0 NetworkManager[55434]: <info>  [1764915918.0474] manager: (tap22b7cda4-bb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Dec 05 06:25:18 compute-0 nova_compute[186329]: 2025-12-05 06:25:18.049 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:25:18 compute-0 nova_compute[186329]: 2025-12-05 06:25:18.052 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:18 compute-0 nova_compute[186329]: 2025-12-05 06:25:18.052 186333 INFO os_vif [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:49:64:e1,bridge_name='br-int',has_traffic_filtering=True,id=22b7cda4-bb59-4fdf-812a-a071b6a4d13f,network=Network(22a11164-8d14-45f7-8928-10d564f2f223),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap22b7cda4-bb')
Dec 05 06:25:18 compute-0 nova_compute[186329]: 2025-12-05 06:25:18.861 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:19 compute-0 nova_compute[186329]: 2025-12-05 06:25:19.577 186333 DEBUG nova.virt.libvirt.driver [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 05 06:25:19 compute-0 nova_compute[186329]: 2025-12-05 06:25:19.577 186333 DEBUG nova.virt.libvirt.driver [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 05 06:25:19 compute-0 nova_compute[186329]: 2025-12-05 06:25:19.577 186333 DEBUG nova.virt.libvirt.driver [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] No VIF found with MAC fa:16:3e:49:64:e1, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 05 06:25:19 compute-0 nova_compute[186329]: 2025-12-05 06:25:19.578 186333 INFO nova.virt.libvirt.driver [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Using config drive
Dec 05 06:25:20 compute-0 nova_compute[186329]: 2025-12-05 06:25:20.084 186333 WARNING neutronclient.v2_0.client [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:25:20 compute-0 nova_compute[186329]: 2025-12-05 06:25:20.628 186333 INFO nova.virt.libvirt.driver [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Creating config drive at /var/lib/nova/instances/0e7a5bec-7f70-404a-bb58-d6b499c04bae/disk.config
Dec 05 06:25:20 compute-0 nova_compute[186329]: 2025-12-05 06:25:20.633 186333 DEBUG oslo_concurrency.processutils [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0e7a5bec-7f70-404a-bb58-d6b499c04bae/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpzs5dmkyv execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:25:20 compute-0 nova_compute[186329]: 2025-12-05 06:25:20.752 186333 DEBUG oslo_concurrency.processutils [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0e7a5bec-7f70-404a-bb58-d6b499c04bae/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpzs5dmkyv" returned: 0 in 0.119s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:25:20 compute-0 kernel: tap22b7cda4-bb: entered promiscuous mode
Dec 05 06:25:20 compute-0 NetworkManager[55434]: <info>  [1764915920.7925] manager: (tap22b7cda4-bb): new Tun device (/org/freedesktop/NetworkManager/Devices/67)
Dec 05 06:25:20 compute-0 nova_compute[186329]: 2025-12-05 06:25:20.791 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:20 compute-0 ovn_controller[95223]: 2025-12-05T06:25:20Z|00147|binding|INFO|Claiming lport 22b7cda4-bb59-4fdf-812a-a071b6a4d13f for this chassis.
Dec 05 06:25:20 compute-0 ovn_controller[95223]: 2025-12-05T06:25:20Z|00148|binding|INFO|22b7cda4-bb59-4fdf-812a-a071b6a4d13f: Claiming fa:16:3e:49:64:e1 10.100.0.9
Dec 05 06:25:20 compute-0 nova_compute[186329]: 2025-12-05 06:25:20.795 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:20.801 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:49:64:e1 10.100.0.9'], port_security=['fa:16:3e:49:64:e1 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '0e7a5bec-7f70-404a-bb58-d6b499c04bae', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-22a11164-8d14-45f7-8928-10d564f2f223', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4548dd99e0bd4ca59433132b59d02fcd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8d117c96-dc46-40e2-bd38-a754a1602f7d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=359366c3-1c13-443d-9597-209e3af69d9c, chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], logical_port=22b7cda4-bb59-4fdf-812a-a071b6a4d13f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:25:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:20.802 104041 INFO neutron.agent.ovn.metadata.agent [-] Port 22b7cda4-bb59-4fdf-812a-a071b6a4d13f in datapath 22a11164-8d14-45f7-8928-10d564f2f223 bound to our chassis
Dec 05 06:25:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:20.803 104041 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 22a11164-8d14-45f7-8928-10d564f2f223
Dec 05 06:25:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:20.811 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[7ff80411-3123-4852-82fb-9266895cd31b]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:25:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:20.812 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap22a11164-81 in ovnmeta-22a11164-8d14-45f7-8928-10d564f2f223 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 05 06:25:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:20.814 206383 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap22a11164-80 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 05 06:25:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:20.814 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[fbcaf8b5-b784-4f95-a429-97266c477308]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:25:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:20.815 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[3f415171-4330-4e95-a46c-c99a772b1173]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:25:20 compute-0 systemd-udevd[212230]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 06:25:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:20.823 104153 DEBUG oslo.privsep.daemon [-] privsep: reply[88ee6be0-806d-4997-a59e-09f0b6108d3c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:25:20 compute-0 NetworkManager[55434]: <info>  [1764915920.8310] device (tap22b7cda4-bb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 06:25:20 compute-0 NetworkManager[55434]: <info>  [1764915920.8319] device (tap22b7cda4-bb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 06:25:20 compute-0 systemd-machined[152967]: New machine qemu-13-instance-00000011.
Dec 05 06:25:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:20.840 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[3805c2fe-1209-4421-b1f6-fcaf86f49afa]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:25:20 compute-0 nova_compute[186329]: 2025-12-05 06:25:20.853 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:20 compute-0 systemd[1]: Started Virtual Machine qemu-13-instance-00000011.
Dec 05 06:25:20 compute-0 ovn_controller[95223]: 2025-12-05T06:25:20Z|00149|binding|INFO|Setting lport 22b7cda4-bb59-4fdf-812a-a071b6a4d13f ovn-installed in OVS
Dec 05 06:25:20 compute-0 ovn_controller[95223]: 2025-12-05T06:25:20Z|00150|binding|INFO|Setting lport 22b7cda4-bb59-4fdf-812a-a071b6a4d13f up in Southbound
Dec 05 06:25:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:20.859 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[f5411563-5c6e-46dc-8ac7-6b4642ebc97a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:25:20 compute-0 nova_compute[186329]: 2025-12-05 06:25:20.860 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:20.862 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[2381aab7-ad3e-4e67-8408-6199cab80f0c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:25:20 compute-0 NetworkManager[55434]: <info>  [1764915920.8630] manager: (tap22a11164-80): new Veth device (/org/freedesktop/NetworkManager/Devices/68)
Dec 05 06:25:20 compute-0 systemd-udevd[212236]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 06:25:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:20.887 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[683700bb-6697-4e1d-a620-f53e992bc6ac]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:25:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:20.889 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[441ff87b-49c6-4099-98cf-2523e4f5f578]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:25:20 compute-0 NetworkManager[55434]: <info>  [1764915920.9057] device (tap22a11164-80): carrier: link connected
Dec 05 06:25:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:20.910 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[a7637d8f-a8ca-4f1a-a7cc-86fc938e6c10]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:25:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:20.924 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[b8bb064c-d627-4859-97a1-f63bba93b5b8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap22a11164-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:cb:f5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 379944, 'reachable_time': 22713, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212258, 'error': None, 'target': 'ovnmeta-22a11164-8d14-45f7-8928-10d564f2f223', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:25:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:20.935 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[c7f57801-e10d-4ee1-9ed4-92754151c093]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe00:cbf5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 379944, 'tstamp': 379944}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212259, 'error': None, 'target': 'ovnmeta-22a11164-8d14-45f7-8928-10d564f2f223', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:25:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:20.946 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[ac065117-27f8-40bb-a8ea-7963abfb5416]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap22a11164-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:cb:f5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 379944, 'reachable_time': 22713, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212260, 'error': None, 'target': 'ovnmeta-22a11164-8d14-45f7-8928-10d564f2f223', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:25:20 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:20.965 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[508c7198-5f0f-480d-99fe-d2816802ab72]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:21.005 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[be6c8b38-6626-4596-a183-c188f8025084]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:21.006 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap22a11164-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:21.006 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:21.006 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap22a11164-80, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:25:21 compute-0 NetworkManager[55434]: <info>  [1764915921.0084] manager: (tap22a11164-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/69)
Dec 05 06:25:21 compute-0 kernel: tap22a11164-80: entered promiscuous mode
Dec 05 06:25:21 compute-0 nova_compute[186329]: 2025-12-05 06:25:21.009 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:21.012 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap22a11164-80, col_values=(('external_ids', {'iface-id': '5199b61b-18e1-4d88-bc76-49efa68df776'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:25:21 compute-0 nova_compute[186329]: 2025-12-05 06:25:21.012 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:21 compute-0 ovn_controller[95223]: 2025-12-05T06:25:21Z|00151|binding|INFO|Releasing lport 5199b61b-18e1-4d88-bc76-49efa68df776 from this chassis (sb_readonly=0)
Dec 05 06:25:21 compute-0 nova_compute[186329]: 2025-12-05 06:25:21.024 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:21.025 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[d8b942f4-3bde-431c-bfe6-b27f7e09b3ff]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:21.026 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/22a11164-8d14-45f7-8928-10d564f2f223.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/22a11164-8d14-45f7-8928-10d564f2f223.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:21.026 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/22a11164-8d14-45f7-8928-10d564f2f223.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/22a11164-8d14-45f7-8928-10d564f2f223.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:21.026 104041 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 22a11164-8d14-45f7-8928-10d564f2f223 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:21.026 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/22a11164-8d14-45f7-8928-10d564f2f223.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/22a11164-8d14-45f7-8928-10d564f2f223.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:21.026 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[66f04f64-5171-4c64-988b-fa988b9e53c1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:21.027 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/22a11164-8d14-45f7-8928-10d564f2f223.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/22a11164-8d14-45f7-8928-10d564f2f223.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:21.027 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[59cdb2a7-05c6-4067-9fde-d4e2d4b17d4c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:21.027 104041 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]: global
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]:     log         /dev/log local0 debug
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]:     log-tag     haproxy-metadata-proxy-22a11164-8d14-45f7-8928-10d564f2f223
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]:     user        root
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]:     group       root
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]:     maxconn     1024
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]:     pidfile     /var/lib/neutron/external/pids/22a11164-8d14-45f7-8928-10d564f2f223.pid.haproxy
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]:     daemon
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]: defaults
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]:     log global
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]:     mode http
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]:     option httplog
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]:     option dontlognull
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]:     option http-server-close
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]:     option forwardfor
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]:     retries                 3
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]:     timeout http-request    30s
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]:     timeout connect         30s
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]:     timeout client          32s
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]:     timeout server          32s
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]:     timeout http-keep-alive 30s
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]: listen listener
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]:     bind 169.254.169.254:80
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]:     
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]:     http-request add-header X-OVN-Network-ID 22a11164-8d14-45f7-8928-10d564f2f223
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 05 06:25:21 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:21.028 104041 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-22a11164-8d14-45f7-8928-10d564f2f223', 'env', 'PROCESS_TAG=haproxy-22a11164-8d14-45f7-8928-10d564f2f223', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/22a11164-8d14-45f7-8928-10d564f2f223.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 05 06:25:21 compute-0 podman[212295]: 2025-12-05 06:25:21.328279638 +0000 UTC m=+0.030008004 container create c19e6143c45d204daf51c60d4f258cd12e5d943890da6b99cd968ef0689d9bdc (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-22a11164-8d14-45f7-8928-10d564f2f223, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 05 06:25:21 compute-0 systemd[1]: Started libpod-conmon-c19e6143c45d204daf51c60d4f258cd12e5d943890da6b99cd968ef0689d9bdc.scope.
Dec 05 06:25:21 compute-0 systemd[1]: Started libcrun container.
Dec 05 06:25:21 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3862986a72985d774eebee1c1bfc4271551791a601ea6502a247404a08b14524/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 06:25:21 compute-0 podman[212295]: 2025-12-05 06:25:21.395324784 +0000 UTC m=+0.097053170 container init c19e6143c45d204daf51c60d4f258cd12e5d943890da6b99cd968ef0689d9bdc (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-22a11164-8d14-45f7-8928-10d564f2f223, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 05 06:25:21 compute-0 podman[212295]: 2025-12-05 06:25:21.399975018 +0000 UTC m=+0.101703394 container start c19e6143c45d204daf51c60d4f258cd12e5d943890da6b99cd968ef0689d9bdc (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-22a11164-8d14-45f7-8928-10d564f2f223, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 05 06:25:21 compute-0 podman[212295]: 2025-12-05 06:25:21.314954869 +0000 UTC m=+0.016683265 image pull 773fbabd60bc1c8e2ad203301a187a593937fa42bcc751aba0971bd96baac0cb quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current
Dec 05 06:25:21 compute-0 neutron-haproxy-ovnmeta-22a11164-8d14-45f7-8928-10d564f2f223[212307]: [NOTICE]   (212311) : New worker (212313) forked
Dec 05 06:25:21 compute-0 neutron-haproxy-ovnmeta-22a11164-8d14-45f7-8928-10d564f2f223[212307]: [NOTICE]   (212311) : Loading success.
Dec 05 06:25:21 compute-0 nova_compute[186329]: 2025-12-05 06:25:21.629 186333 DEBUG nova.compute.manager [req-98c68d96-1aa1-4218-b9a5-b37aa37cde1c req-154fd98c-65d2-4f7a-afe2-5d4231c85657 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Received event network-vif-plugged-22b7cda4-bb59-4fdf-812a-a071b6a4d13f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:25:21 compute-0 nova_compute[186329]: 2025-12-05 06:25:21.631 186333 DEBUG oslo_concurrency.lockutils [req-98c68d96-1aa1-4218-b9a5-b37aa37cde1c req-154fd98c-65d2-4f7a-afe2-5d4231c85657 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "0e7a5bec-7f70-404a-bb58-d6b499c04bae-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:25:21 compute-0 nova_compute[186329]: 2025-12-05 06:25:21.631 186333 DEBUG oslo_concurrency.lockutils [req-98c68d96-1aa1-4218-b9a5-b37aa37cde1c req-154fd98c-65d2-4f7a-afe2-5d4231c85657 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "0e7a5bec-7f70-404a-bb58-d6b499c04bae-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:25:21 compute-0 nova_compute[186329]: 2025-12-05 06:25:21.631 186333 DEBUG oslo_concurrency.lockutils [req-98c68d96-1aa1-4218-b9a5-b37aa37cde1c req-154fd98c-65d2-4f7a-afe2-5d4231c85657 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "0e7a5bec-7f70-404a-bb58-d6b499c04bae-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:25:21 compute-0 nova_compute[186329]: 2025-12-05 06:25:21.631 186333 DEBUG nova.compute.manager [req-98c68d96-1aa1-4218-b9a5-b37aa37cde1c req-154fd98c-65d2-4f7a-afe2-5d4231c85657 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Processing event network-vif-plugged-22b7cda4-bb59-4fdf-812a-a071b6a4d13f _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 05 06:25:21 compute-0 nova_compute[186329]: 2025-12-05 06:25:21.632 186333 DEBUG nova.compute.manager [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 05 06:25:21 compute-0 nova_compute[186329]: 2025-12-05 06:25:21.636 186333 DEBUG nova.virt.libvirt.driver [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Dec 05 06:25:21 compute-0 nova_compute[186329]: 2025-12-05 06:25:21.638 186333 INFO nova.virt.libvirt.driver [-] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Instance spawned successfully.
Dec 05 06:25:21 compute-0 nova_compute[186329]: 2025-12-05 06:25:21.639 186333 DEBUG nova.virt.libvirt.driver [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Dec 05 06:25:22 compute-0 nova_compute[186329]: 2025-12-05 06:25:22.149 186333 DEBUG nova.virt.libvirt.driver [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:25:22 compute-0 nova_compute[186329]: 2025-12-05 06:25:22.149 186333 DEBUG nova.virt.libvirt.driver [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:25:22 compute-0 nova_compute[186329]: 2025-12-05 06:25:22.150 186333 DEBUG nova.virt.libvirt.driver [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:25:22 compute-0 nova_compute[186329]: 2025-12-05 06:25:22.150 186333 DEBUG nova.virt.libvirt.driver [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:25:22 compute-0 nova_compute[186329]: 2025-12-05 06:25:22.150 186333 DEBUG nova.virt.libvirt.driver [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:25:22 compute-0 nova_compute[186329]: 2025-12-05 06:25:22.150 186333 DEBUG nova.virt.libvirt.driver [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:25:22 compute-0 nova_compute[186329]: 2025-12-05 06:25:22.657 186333 INFO nova.compute.manager [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Took 7.58 seconds to spawn the instance on the hypervisor.
Dec 05 06:25:22 compute-0 nova_compute[186329]: 2025-12-05 06:25:22.657 186333 DEBUG nova.compute.manager [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 05 06:25:23 compute-0 nova_compute[186329]: 2025-12-05 06:25:23.049 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:23 compute-0 nova_compute[186329]: 2025-12-05 06:25:23.179 186333 INFO nova.compute.manager [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Took 12.72 seconds to build instance.
Dec 05 06:25:23 compute-0 nova_compute[186329]: 2025-12-05 06:25:23.680 186333 DEBUG nova.compute.manager [req-d4478308-7bb5-41a5-a050-cef9802f5154 req-f9045dd9-6ac0-4933-bc5b-9fa48aac1a38 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Received event network-vif-plugged-22b7cda4-bb59-4fdf-812a-a071b6a4d13f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:25:23 compute-0 nova_compute[186329]: 2025-12-05 06:25:23.681 186333 DEBUG oslo_concurrency.lockutils [req-d4478308-7bb5-41a5-a050-cef9802f5154 req-f9045dd9-6ac0-4933-bc5b-9fa48aac1a38 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "0e7a5bec-7f70-404a-bb58-d6b499c04bae-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:25:23 compute-0 nova_compute[186329]: 2025-12-05 06:25:23.682 186333 DEBUG oslo_concurrency.lockutils [req-d4478308-7bb5-41a5-a050-cef9802f5154 req-f9045dd9-6ac0-4933-bc5b-9fa48aac1a38 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "0e7a5bec-7f70-404a-bb58-d6b499c04bae-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:25:23 compute-0 nova_compute[186329]: 2025-12-05 06:25:23.683 186333 DEBUG oslo_concurrency.lockutils [req-d4478308-7bb5-41a5-a050-cef9802f5154 req-f9045dd9-6ac0-4933-bc5b-9fa48aac1a38 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "0e7a5bec-7f70-404a-bb58-d6b499c04bae-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:25:23 compute-0 nova_compute[186329]: 2025-12-05 06:25:23.683 186333 DEBUG nova.compute.manager [req-d4478308-7bb5-41a5-a050-cef9802f5154 req-f9045dd9-6ac0-4933-bc5b-9fa48aac1a38 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] No waiting events found dispatching network-vif-plugged-22b7cda4-bb59-4fdf-812a-a071b6a4d13f pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:25:23 compute-0 nova_compute[186329]: 2025-12-05 06:25:23.683 186333 WARNING nova.compute.manager [req-d4478308-7bb5-41a5-a050-cef9802f5154 req-f9045dd9-6ac0-4933-bc5b-9fa48aac1a38 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Received unexpected event network-vif-plugged-22b7cda4-bb59-4fdf-812a-a071b6a4d13f for instance with vm_state active and task_state None.
Dec 05 06:25:23 compute-0 nova_compute[186329]: 2025-12-05 06:25:23.684 186333 DEBUG oslo_concurrency.lockutils [None req-8c69c5cc-b402-4e40-92b6-5bda57ddfd4f 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Lock "0e7a5bec-7f70-404a-bb58-d6b499c04bae" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.230s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:25:23 compute-0 nova_compute[186329]: 2025-12-05 06:25:23.862 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:28 compute-0 nova_compute[186329]: 2025-12-05 06:25:28.051 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:28 compute-0 podman[212319]: 2025-12-05 06:25:28.466677319 +0000 UTC m=+0.043650471 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 05 06:25:28 compute-0 podman[212318]: 2025-12-05 06:25:28.487022731 +0000 UTC m=+0.068772033 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, tcib_managed=true, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 05 06:25:28 compute-0 nova_compute[186329]: 2025-12-05 06:25:28.864 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:29.510 104041 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:25:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:29.510 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:25:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:29.511 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:25:29 compute-0 podman[196599]: time="2025-12-05T06:25:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:25:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:25:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18588 "" "Go-http-client/1.1"
Dec 05 06:25:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:25:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3043 "" "Go-http-client/1.1"
Dec 05 06:25:31 compute-0 openstack_network_exporter[198686]: ERROR   06:25:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:25:31 compute-0 openstack_network_exporter[198686]: ERROR   06:25:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:25:31 compute-0 openstack_network_exporter[198686]: ERROR   06:25:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:25:31 compute-0 openstack_network_exporter[198686]: ERROR   06:25:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:25:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:25:31 compute-0 openstack_network_exporter[198686]: ERROR   06:25:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:25:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:25:33 compute-0 nova_compute[186329]: 2025-12-05 06:25:33.053 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:33 compute-0 ovn_controller[95223]: 2025-12-05T06:25:33Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:49:64:e1 10.100.0.9
Dec 05 06:25:33 compute-0 ovn_controller[95223]: 2025-12-05T06:25:33Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:49:64:e1 10.100.0.9
Dec 05 06:25:33 compute-0 nova_compute[186329]: 2025-12-05 06:25:33.866 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:38 compute-0 nova_compute[186329]: 2025-12-05 06:25:38.055 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:38 compute-0 nova_compute[186329]: 2025-12-05 06:25:38.866 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:39 compute-0 podman[212373]: 2025-12-05 06:25:39.469526108 +0000 UTC m=+0.049456443 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 05 06:25:39 compute-0 podman[212372]: 2025-12-05 06:25:39.469764235 +0000 UTC m=+0.051675154 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, container_name=openstack_network_exporter, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, config_id=edpm, maintainer=Red Hat, Inc., name=ubi9-minimal)
Dec 05 06:25:39 compute-0 podman[212371]: 2025-12-05 06:25:39.482873162 +0000 UTC m=+0.067955134 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Dec 05 06:25:39 compute-0 nova_compute[186329]: 2025-12-05 06:25:39.846 186333 DEBUG nova.virt.libvirt.driver [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dceafbc9-e9dc-4fdc-9808-7a98ce863957] Creating tmpfile /var/lib/nova/instances/tmpqr5os1v0 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Dec 05 06:25:39 compute-0 nova_compute[186329]: 2025-12-05 06:25:39.847 186333 WARNING neutronclient.v2_0.client [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:25:39 compute-0 nova_compute[186329]: 2025-12-05 06:25:39.854 186333 DEBUG nova.compute.manager [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpqr5os1v0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Dec 05 06:25:41 compute-0 nova_compute[186329]: 2025-12-05 06:25:41.880 186333 WARNING neutronclient.v2_0.client [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:25:43 compute-0 nova_compute[186329]: 2025-12-05 06:25:43.056 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:43 compute-0 nova_compute[186329]: 2025-12-05 06:25:43.867 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:45 compute-0 nova_compute[186329]: 2025-12-05 06:25:45.677 186333 DEBUG nova.compute.manager [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpqr5os1v0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='dceafbc9-e9dc-4fdc-9808-7a98ce863957',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Dec 05 06:25:46 compute-0 nova_compute[186329]: 2025-12-05 06:25:46.686 186333 DEBUG oslo_concurrency.lockutils [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "refresh_cache-dceafbc9-e9dc-4fdc-9808-7a98ce863957" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:25:46 compute-0 nova_compute[186329]: 2025-12-05 06:25:46.686 186333 DEBUG oslo_concurrency.lockutils [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquired lock "refresh_cache-dceafbc9-e9dc-4fdc-9808-7a98ce863957" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:25:46 compute-0 nova_compute[186329]: 2025-12-05 06:25:46.686 186333 DEBUG nova.network.neutron [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dceafbc9-e9dc-4fdc-9808-7a98ce863957] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 05 06:25:47 compute-0 nova_compute[186329]: 2025-12-05 06:25:47.192 186333 WARNING neutronclient.v2_0.client [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:25:47 compute-0 nova_compute[186329]: 2025-12-05 06:25:47.450 186333 WARNING neutronclient.v2_0.client [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:25:47 compute-0 nova_compute[186329]: 2025-12-05 06:25:47.548 186333 DEBUG nova.network.neutron [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dceafbc9-e9dc-4fdc-9808-7a98ce863957] Updating instance_info_cache with network_info: [{"id": "01674f7c-8e94-4be6-8bf0-a6d112d7fc2a", "address": "fa:16:3e:1f:da:87", "network": {"id": "22a11164-8d14-45f7-8928-10d564f2f223", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-651760568-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "698ff04a95fc4ac6be1eb0e1dbe01302", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01674f7c-8e", "ovs_interfaceid": "01674f7c-8e94-4be6-8bf0-a6d112d7fc2a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:25:48 compute-0 nova_compute[186329]: 2025-12-05 06:25:48.054 186333 DEBUG oslo_concurrency.lockutils [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Releasing lock "refresh_cache-dceafbc9-e9dc-4fdc-9808-7a98ce863957" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:25:48 compute-0 nova_compute[186329]: 2025-12-05 06:25:48.059 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:48 compute-0 nova_compute[186329]: 2025-12-05 06:25:48.061 186333 DEBUG nova.virt.libvirt.driver [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dceafbc9-e9dc-4fdc-9808-7a98ce863957] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpqr5os1v0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='dceafbc9-e9dc-4fdc-9808-7a98ce863957',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Dec 05 06:25:48 compute-0 nova_compute[186329]: 2025-12-05 06:25:48.062 186333 DEBUG nova.virt.libvirt.driver [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dceafbc9-e9dc-4fdc-9808-7a98ce863957] Creating instance directory: /var/lib/nova/instances/dceafbc9-e9dc-4fdc-9808-7a98ce863957 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Dec 05 06:25:48 compute-0 nova_compute[186329]: 2025-12-05 06:25:48.062 186333 DEBUG nova.virt.libvirt.driver [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dceafbc9-e9dc-4fdc-9808-7a98ce863957] Creating disk.info with the contents: {'/var/lib/nova/instances/dceafbc9-e9dc-4fdc-9808-7a98ce863957/disk': 'qcow2', '/var/lib/nova/instances/dceafbc9-e9dc-4fdc-9808-7a98ce863957/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Dec 05 06:25:48 compute-0 nova_compute[186329]: 2025-12-05 06:25:48.062 186333 DEBUG nova.virt.libvirt.driver [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dceafbc9-e9dc-4fdc-9808-7a98ce863957] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Dec 05 06:25:48 compute-0 nova_compute[186329]: 2025-12-05 06:25:48.063 186333 DEBUG nova.objects.instance [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lazy-loading 'trusted_certs' on Instance uuid dceafbc9-e9dc-4fdc-9808-7a98ce863957 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:25:48 compute-0 nova_compute[186329]: 2025-12-05 06:25:48.567 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:25:48 compute-0 nova_compute[186329]: 2025-12-05 06:25:48.570 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:25:48 compute-0 nova_compute[186329]: 2025-12-05 06:25:48.571 186333 DEBUG oslo_concurrency.processutils [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:25:48 compute-0 nova_compute[186329]: 2025-12-05 06:25:48.615 186333 DEBUG oslo_concurrency.processutils [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:25:48 compute-0 nova_compute[186329]: 2025-12-05 06:25:48.616 186333 DEBUG oslo_concurrency.lockutils [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:25:48 compute-0 nova_compute[186329]: 2025-12-05 06:25:48.617 186333 DEBUG oslo_concurrency.lockutils [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:25:48 compute-0 nova_compute[186329]: 2025-12-05 06:25:48.617 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:25:48 compute-0 nova_compute[186329]: 2025-12-05 06:25:48.620 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:25:48 compute-0 nova_compute[186329]: 2025-12-05 06:25:48.620 186333 DEBUG oslo_concurrency.processutils [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:25:48 compute-0 nova_compute[186329]: 2025-12-05 06:25:48.662 186333 DEBUG oslo_concurrency.processutils [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:25:48 compute-0 nova_compute[186329]: 2025-12-05 06:25:48.662 186333 DEBUG oslo_concurrency.processutils [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b,backing_fmt=raw /var/lib/nova/instances/dceafbc9-e9dc-4fdc-9808-7a98ce863957/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:25:48 compute-0 nova_compute[186329]: 2025-12-05 06:25:48.684 186333 DEBUG oslo_concurrency.processutils [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b,backing_fmt=raw /var/lib/nova/instances/dceafbc9-e9dc-4fdc-9808-7a98ce863957/disk 1073741824" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:25:48 compute-0 nova_compute[186329]: 2025-12-05 06:25:48.684 186333 DEBUG oslo_concurrency.lockutils [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.068s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:25:48 compute-0 nova_compute[186329]: 2025-12-05 06:25:48.685 186333 DEBUG oslo_concurrency.processutils [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:25:48 compute-0 nova_compute[186329]: 2025-12-05 06:25:48.726 186333 DEBUG oslo_concurrency.processutils [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:25:48 compute-0 nova_compute[186329]: 2025-12-05 06:25:48.727 186333 DEBUG nova.virt.disk.api [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Checking if we can resize image /var/lib/nova/instances/dceafbc9-e9dc-4fdc-9808-7a98ce863957/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 05 06:25:48 compute-0 nova_compute[186329]: 2025-12-05 06:25:48.727 186333 DEBUG oslo_concurrency.processutils [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dceafbc9-e9dc-4fdc-9808-7a98ce863957/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:25:48 compute-0 nova_compute[186329]: 2025-12-05 06:25:48.769 186333 DEBUG oslo_concurrency.processutils [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dceafbc9-e9dc-4fdc-9808-7a98ce863957/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:25:48 compute-0 nova_compute[186329]: 2025-12-05 06:25:48.769 186333 DEBUG nova.virt.disk.api [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Cannot resize image /var/lib/nova/instances/dceafbc9-e9dc-4fdc-9808-7a98ce863957/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 05 06:25:48 compute-0 nova_compute[186329]: 2025-12-05 06:25:48.770 186333 DEBUG nova.objects.instance [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lazy-loading 'migration_context' on Instance uuid dceafbc9-e9dc-4fdc-9808-7a98ce863957 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:25:48 compute-0 nova_compute[186329]: 2025-12-05 06:25:48.869 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:49 compute-0 nova_compute[186329]: 2025-12-05 06:25:49.275 186333 DEBUG nova.objects.base [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Object Instance<dceafbc9-e9dc-4fdc-9808-7a98ce863957> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Dec 05 06:25:49 compute-0 nova_compute[186329]: 2025-12-05 06:25:49.276 186333 DEBUG oslo_concurrency.processutils [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/dceafbc9-e9dc-4fdc-9808-7a98ce863957/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:25:49 compute-0 nova_compute[186329]: 2025-12-05 06:25:49.293 186333 DEBUG oslo_concurrency.processutils [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/dceafbc9-e9dc-4fdc-9808-7a98ce863957/disk.config 497664" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:25:49 compute-0 nova_compute[186329]: 2025-12-05 06:25:49.293 186333 DEBUG nova.virt.libvirt.driver [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dceafbc9-e9dc-4fdc-9808-7a98ce863957] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Dec 05 06:25:49 compute-0 nova_compute[186329]: 2025-12-05 06:25:49.294 186333 DEBUG nova.virt.libvirt.vif [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-05T06:24:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-144014883',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-144014883',id=16,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T06:25:05Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='4548dd99e0bd4ca59433132b59d02fcd',ramdisk_id='',reservation_id='r-jw9bqagv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,member,admin,reader',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-249593407',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-249593407-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T06:25:05Z,user_data=None,user_id='72f24e9b0fa74da299d3bfff79a1fd92',uuid=dceafbc9-e9dc-4fdc-9808-7a98ce863957,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "01674f7c-8e94-4be6-8bf0-a6d112d7fc2a", "address": "fa:16:3e:1f:da:87", "network": {"id": "22a11164-8d14-45f7-8928-10d564f2f223", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-651760568-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "698ff04a95fc4ac6be1eb0e1dbe01302", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap01674f7c-8e", "ovs_interfaceid": "01674f7c-8e94-4be6-8bf0-a6d112d7fc2a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 05 06:25:49 compute-0 nova_compute[186329]: 2025-12-05 06:25:49.295 186333 DEBUG nova.network.os_vif_util [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Converting VIF {"id": "01674f7c-8e94-4be6-8bf0-a6d112d7fc2a", "address": "fa:16:3e:1f:da:87", "network": {"id": "22a11164-8d14-45f7-8928-10d564f2f223", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-651760568-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "698ff04a95fc4ac6be1eb0e1dbe01302", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap01674f7c-8e", "ovs_interfaceid": "01674f7c-8e94-4be6-8bf0-a6d112d7fc2a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:25:49 compute-0 nova_compute[186329]: 2025-12-05 06:25:49.295 186333 DEBUG nova.network.os_vif_util [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1f:da:87,bridge_name='br-int',has_traffic_filtering=True,id=01674f7c-8e94-4be6-8bf0-a6d112d7fc2a,network=Network(22a11164-8d14-45f7-8928-10d564f2f223),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01674f7c-8e') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:25:49 compute-0 nova_compute[186329]: 2025-12-05 06:25:49.296 186333 DEBUG os_vif [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:da:87,bridge_name='br-int',has_traffic_filtering=True,id=01674f7c-8e94-4be6-8bf0-a6d112d7fc2a,network=Network(22a11164-8d14-45f7-8928-10d564f2f223),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01674f7c-8e') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 05 06:25:49 compute-0 nova_compute[186329]: 2025-12-05 06:25:49.296 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:49 compute-0 nova_compute[186329]: 2025-12-05 06:25:49.297 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:25:49 compute-0 nova_compute[186329]: 2025-12-05 06:25:49.297 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:25:49 compute-0 nova_compute[186329]: 2025-12-05 06:25:49.298 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:49 compute-0 nova_compute[186329]: 2025-12-05 06:25:49.298 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'b64a6d5e-aece-5c22-949f-76597ae745f1', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:25:49 compute-0 nova_compute[186329]: 2025-12-05 06:25:49.301 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:25:49 compute-0 nova_compute[186329]: 2025-12-05 06:25:49.303 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:49 compute-0 nova_compute[186329]: 2025-12-05 06:25:49.303 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap01674f7c-8e, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:25:49 compute-0 nova_compute[186329]: 2025-12-05 06:25:49.303 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap01674f7c-8e, col_values=(('qos', UUID('a72fc3a6-9e14-4de7-bdae-6c5cc8479514')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:25:49 compute-0 nova_compute[186329]: 2025-12-05 06:25:49.304 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap01674f7c-8e, col_values=(('external_ids', {'iface-id': '01674f7c-8e94-4be6-8bf0-a6d112d7fc2a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1f:da:87', 'vm-uuid': 'dceafbc9-e9dc-4fdc-9808-7a98ce863957'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:25:49 compute-0 NetworkManager[55434]: <info>  [1764915949.3054] manager: (tap01674f7c-8e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/70)
Dec 05 06:25:49 compute-0 nova_compute[186329]: 2025-12-05 06:25:49.307 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:25:49 compute-0 nova_compute[186329]: 2025-12-05 06:25:49.309 186333 INFO os_vif [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1f:da:87,bridge_name='br-int',has_traffic_filtering=True,id=01674f7c-8e94-4be6-8bf0-a6d112d7fc2a,network=Network(22a11164-8d14-45f7-8928-10d564f2f223),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01674f7c-8e')
Dec 05 06:25:49 compute-0 nova_compute[186329]: 2025-12-05 06:25:49.310 186333 DEBUG nova.virt.libvirt.driver [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Dec 05 06:25:49 compute-0 nova_compute[186329]: 2025-12-05 06:25:49.310 186333 DEBUG nova.compute.manager [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpqr5os1v0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='dceafbc9-e9dc-4fdc-9808-7a98ce863957',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Dec 05 06:25:49 compute-0 nova_compute[186329]: 2025-12-05 06:25:49.311 186333 WARNING neutronclient.v2_0.client [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:25:49 compute-0 nova_compute[186329]: 2025-12-05 06:25:49.564 186333 WARNING neutronclient.v2_0.client [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:25:49 compute-0 nova_compute[186329]: 2025-12-05 06:25:49.757 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:49 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:49.758 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '46:84:d1', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'a6:99:88:db:d9:a2'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:25:49 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:49.759 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 05 06:25:50 compute-0 nova_compute[186329]: 2025-12-05 06:25:50.021 186333 DEBUG nova.network.neutron [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dceafbc9-e9dc-4fdc-9808-7a98ce863957] Port 01674f7c-8e94-4be6-8bf0-a6d112d7fc2a updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Dec 05 06:25:50 compute-0 nova_compute[186329]: 2025-12-05 06:25:50.028 186333 DEBUG nova.compute.manager [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpqr5os1v0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='dceafbc9-e9dc-4fdc-9808-7a98ce863957',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Dec 05 06:25:51 compute-0 ovn_controller[95223]: 2025-12-05T06:25:51Z|00152|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Dec 05 06:25:53 compute-0 NetworkManager[55434]: <info>  [1764915953.5172] manager: (tap01674f7c-8e): new Tun device (/org/freedesktop/NetworkManager/Devices/71)
Dec 05 06:25:53 compute-0 kernel: tap01674f7c-8e: entered promiscuous mode
Dec 05 06:25:53 compute-0 nova_compute[186329]: 2025-12-05 06:25:53.522 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:53 compute-0 ovn_controller[95223]: 2025-12-05T06:25:53Z|00153|binding|INFO|Claiming lport 01674f7c-8e94-4be6-8bf0-a6d112d7fc2a for this additional chassis.
Dec 05 06:25:53 compute-0 ovn_controller[95223]: 2025-12-05T06:25:53Z|00154|binding|INFO|01674f7c-8e94-4be6-8bf0-a6d112d7fc2a: Claiming fa:16:3e:1f:da:87 10.100.0.6
Dec 05 06:25:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:53.535 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:da:87 10.100.0.6'], port_security=['fa:16:3e:1f:da:87 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'dceafbc9-e9dc-4fdc-9808-7a98ce863957', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-22a11164-8d14-45f7-8928-10d564f2f223', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4548dd99e0bd4ca59433132b59d02fcd', 'neutron:revision_number': '10', 'neutron:security_group_ids': '8d117c96-dc46-40e2-bd38-a754a1602f7d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=359366c3-1c13-443d-9597-209e3af69d9c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=01674f7c-8e94-4be6-8bf0-a6d112d7fc2a) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:25:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:53.535 104041 INFO neutron.agent.ovn.metadata.agent [-] Port 01674f7c-8e94-4be6-8bf0-a6d112d7fc2a in datapath 22a11164-8d14-45f7-8928-10d564f2f223 unbound from our chassis
Dec 05 06:25:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:53.536 104041 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 22a11164-8d14-45f7-8928-10d564f2f223
Dec 05 06:25:53 compute-0 ovn_controller[95223]: 2025-12-05T06:25:53Z|00155|binding|INFO|Setting lport 01674f7c-8e94-4be6-8bf0-a6d112d7fc2a ovn-installed in OVS
Dec 05 06:25:53 compute-0 nova_compute[186329]: 2025-12-05 06:25:53.539 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:53 compute-0 nova_compute[186329]: 2025-12-05 06:25:53.540 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:53 compute-0 nova_compute[186329]: 2025-12-05 06:25:53.542 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:53 compute-0 systemd-udevd[212459]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 06:25:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:53.548 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[2510df4b-7e2e-4ef5-8ecf-08521ec5b55e]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:25:53 compute-0 systemd-machined[152967]: New machine qemu-14-instance-00000010.
Dec 05 06:25:53 compute-0 NetworkManager[55434]: <info>  [1764915953.5605] device (tap01674f7c-8e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 06:25:53 compute-0 systemd[1]: Started Virtual Machine qemu-14-instance-00000010.
Dec 05 06:25:53 compute-0 NetworkManager[55434]: <info>  [1764915953.5626] device (tap01674f7c-8e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 06:25:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:53.570 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[56672ae0-945a-4125-bc3e-118aa89e3f6e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:25:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:53.571 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[f186f9dd-48c4-4239-b86a-9ebc57264b8d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:25:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:53.590 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[052346a4-726c-4043-998f-561babd09d89]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:25:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:53.601 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[45a85d21-d2a3-477d-8dde-cfa8855718ac]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap22a11164-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:cb:f5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 379944, 'reachable_time': 23588, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212470, 'error': None, 'target': 'ovnmeta-22a11164-8d14-45f7-8928-10d564f2f223', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:25:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:53.612 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[28facb35-6b1a-4a99-976c-120d1bbcbc4a]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap22a11164-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 379951, 'tstamp': 379951}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212473, 'error': None, 'target': 'ovnmeta-22a11164-8d14-45f7-8928-10d564f2f223', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap22a11164-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 379953, 'tstamp': 379953}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212473, 'error': None, 'target': 'ovnmeta-22a11164-8d14-45f7-8928-10d564f2f223', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:25:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:53.613 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap22a11164-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:25:53 compute-0 nova_compute[186329]: 2025-12-05 06:25:53.614 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:53.615 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap22a11164-80, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:25:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:53.615 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:25:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:53.616 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap22a11164-80, col_values=(('external_ids', {'iface-id': '5199b61b-18e1-4d88-bc76-49efa68df776'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:25:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:53.616 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:25:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:53.617 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[a63befbb-693e-4108-9637-b9043b345cf1]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-22a11164-8d14-45f7-8928-10d564f2f223\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/22a11164-8d14-45f7-8928-10d564f2f223.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 22a11164-8d14-45f7-8928-10d564f2f223\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:25:53 compute-0 nova_compute[186329]: 2025-12-05 06:25:53.870 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:54 compute-0 nova_compute[186329]: 2025-12-05 06:25:54.215 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:25:54 compute-0 nova_compute[186329]: 2025-12-05 06:25:54.305 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:54 compute-0 nova_compute[186329]: 2025-12-05 06:25:54.705 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:25:54 compute-0 nova_compute[186329]: 2025-12-05 06:25:54.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:25:55 compute-0 nova_compute[186329]: 2025-12-05 06:25:55.217 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:25:55 compute-0 nova_compute[186329]: 2025-12-05 06:25:55.217 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:25:55 compute-0 nova_compute[186329]: 2025-12-05 06:25:55.217 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:25:55 compute-0 nova_compute[186329]: 2025-12-05 06:25:55.217 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 05 06:25:55 compute-0 ovn_controller[95223]: 2025-12-05T06:25:55Z|00156|binding|INFO|Claiming lport 01674f7c-8e94-4be6-8bf0-a6d112d7fc2a for this chassis.
Dec 05 06:25:55 compute-0 ovn_controller[95223]: 2025-12-05T06:25:55Z|00157|binding|INFO|01674f7c-8e94-4be6-8bf0-a6d112d7fc2a: Claiming fa:16:3e:1f:da:87 10.100.0.6
Dec 05 06:25:55 compute-0 ovn_controller[95223]: 2025-12-05T06:25:55Z|00158|binding|INFO|Setting lport 01674f7c-8e94-4be6-8bf0-a6d112d7fc2a up in Southbound
Dec 05 06:25:56 compute-0 nova_compute[186329]: 2025-12-05 06:25:56.248 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:25:56 compute-0 nova_compute[186329]: 2025-12-05 06:25:56.292 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:25:56 compute-0 nova_compute[186329]: 2025-12-05 06:25:56.293 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:25:56 compute-0 nova_compute[186329]: 2025-12-05 06:25:56.334 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:25:56 compute-0 nova_compute[186329]: 2025-12-05 06:25:56.338 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0e7a5bec-7f70-404a-bb58-d6b499c04bae/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:25:56 compute-0 nova_compute[186329]: 2025-12-05 06:25:56.392 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0e7a5bec-7f70-404a-bb58-d6b499c04bae/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:25:56 compute-0 nova_compute[186329]: 2025-12-05 06:25:56.392 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0e7a5bec-7f70-404a-bb58-d6b499c04bae/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:25:56 compute-0 nova_compute[186329]: 2025-12-05 06:25:56.434 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0e7a5bec-7f70-404a-bb58-d6b499c04bae/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:25:56 compute-0 nova_compute[186329]: 2025-12-05 06:25:56.438 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dceafbc9-e9dc-4fdc-9808-7a98ce863957/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:25:56 compute-0 nova_compute[186329]: 2025-12-05 06:25:56.482 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dceafbc9-e9dc-4fdc-9808-7a98ce863957/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:25:56 compute-0 nova_compute[186329]: 2025-12-05 06:25:56.482 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dceafbc9-e9dc-4fdc-9808-7a98ce863957/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:25:56 compute-0 nova_compute[186329]: 2025-12-05 06:25:56.524 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dceafbc9-e9dc-4fdc-9808-7a98ce863957/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:25:56 compute-0 nova_compute[186329]: 2025-12-05 06:25:56.735 186333 WARNING nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:25:56 compute-0 nova_compute[186329]: 2025-12-05 06:25:56.736 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:25:56 compute-0 nova_compute[186329]: 2025-12-05 06:25:56.751 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.015s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:25:56 compute-0 nova_compute[186329]: 2025-12-05 06:25:56.752 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5427MB free_disk=73.07980728149414GB free_vcpus=1 pci_devices=[{"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": 
"0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 05 06:25:56 compute-0 nova_compute[186329]: 2025-12-05 06:25:56.752 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:25:56 compute-0 nova_compute[186329]: 2025-12-05 06:25:56.752 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:25:56 compute-0 nova_compute[186329]: 2025-12-05 06:25:56.898 186333 INFO nova.compute.manager [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dceafbc9-e9dc-4fdc-9808-7a98ce863957] Post operation of migration started
Dec 05 06:25:56 compute-0 nova_compute[186329]: 2025-12-05 06:25:56.899 186333 WARNING neutronclient.v2_0.client [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:25:56 compute-0 nova_compute[186329]: 2025-12-05 06:25:56.988 186333 WARNING neutronclient.v2_0.client [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:25:56 compute-0 nova_compute[186329]: 2025-12-05 06:25:56.988 186333 WARNING neutronclient.v2_0.client [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:25:57 compute-0 nova_compute[186329]: 2025-12-05 06:25:57.039 186333 DEBUG oslo_concurrency.lockutils [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "refresh_cache-dceafbc9-e9dc-4fdc-9808-7a98ce863957" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:25:57 compute-0 nova_compute[186329]: 2025-12-05 06:25:57.039 186333 DEBUG oslo_concurrency.lockutils [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquired lock "refresh_cache-dceafbc9-e9dc-4fdc-9808-7a98ce863957" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:25:57 compute-0 nova_compute[186329]: 2025-12-05 06:25:57.039 186333 DEBUG nova.network.neutron [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dceafbc9-e9dc-4fdc-9808-7a98ce863957] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 05 06:25:57 compute-0 nova_compute[186329]: 2025-12-05 06:25:57.543 186333 WARNING neutronclient.v2_0.client [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:25:57 compute-0 nova_compute[186329]: 2025-12-05 06:25:57.764 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Migration for instance dceafbc9-e9dc-4fdc-9808-7a98ce863957 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Dec 05 06:25:58 compute-0 nova_compute[186329]: 2025-12-05 06:25:58.269 186333 INFO nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] [instance: dceafbc9-e9dc-4fdc-9808-7a98ce863957] Updating resource usage from migration e458bb2a-1c8e-40f5-a5ea-f5c176f0738b
Dec 05 06:25:58 compute-0 nova_compute[186329]: 2025-12-05 06:25:58.269 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] [instance: dceafbc9-e9dc-4fdc-9808-7a98ce863957] Starting to track incoming migration e458bb2a-1c8e-40f5-a5ea-f5c176f0738b with flavor cb13e320-971c-46c2-a935-d695f3631bf8 _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Dec 05 06:25:58 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:25:58.760 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=89d40815-76f5-4f1d-9077-84d831b7d6c4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:25:58 compute-0 nova_compute[186329]: 2025-12-05 06:25:58.871 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:59 compute-0 nova_compute[186329]: 2025-12-05 06:25:59.307 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:25:59 compute-0 nova_compute[186329]: 2025-12-05 06:25:59.323 186333 INFO nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance 83dcc9e9-9fcf-4e34-8b76-598c38ed9bc0 has allocations against this compute host but is not found in the database.
Dec 05 06:25:59 compute-0 nova_compute[186329]: 2025-12-05 06:25:59.323 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance 0e7a5bec-7f70-404a-bb58-d6b499c04bae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 05 06:25:59 compute-0 podman[212513]: 2025-12-05 06:25:59.456862668 +0000 UTC m=+0.039633946 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 06:25:59 compute-0 podman[212512]: 2025-12-05 06:25:59.480065011 +0000 UTC m=+0.064902674 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, container_name=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 05 06:25:59 compute-0 podman[196599]: time="2025-12-05T06:25:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:25:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:25:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18588 "" "Go-http-client/1.1"
Dec 05 06:25:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:25:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3052 "" "Go-http-client/1.1"
Dec 05 06:25:59 compute-0 nova_compute[186329]: 2025-12-05 06:25:59.827 186333 WARNING nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance dceafbc9-e9dc-4fdc-9808-7a98ce863957 has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Dec 05 06:25:59 compute-0 nova_compute[186329]: 2025-12-05 06:25:59.827 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 05 06:25:59 compute-0 nova_compute[186329]: 2025-12-05 06:25:59.828 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=4 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 06:25:56 up  1:03,  0 user,  load average: 0.25, 0.22, 0.27\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_4548dd99e0bd4ca59433132b59d02fcd': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 05 06:25:59 compute-0 nova_compute[186329]: 2025-12-05 06:25:59.925 186333 DEBUG nova.compute.provider_tree [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:26:00 compute-0 nova_compute[186329]: 2025-12-05 06:26:00.430 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:26:00 compute-0 nova_compute[186329]: 2025-12-05 06:26:00.937 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 05 06:26:00 compute-0 nova_compute[186329]: 2025-12-05 06:26:00.937 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.185s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:26:01 compute-0 openstack_network_exporter[198686]: ERROR   06:26:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:26:01 compute-0 openstack_network_exporter[198686]: ERROR   06:26:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:26:01 compute-0 openstack_network_exporter[198686]: ERROR   06:26:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:26:01 compute-0 openstack_network_exporter[198686]: ERROR   06:26:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:26:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:26:01 compute-0 openstack_network_exporter[198686]: ERROR   06:26:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:26:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:26:01 compute-0 nova_compute[186329]: 2025-12-05 06:26:01.553 186333 WARNING neutronclient.v2_0.client [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:26:01 compute-0 nova_compute[186329]: 2025-12-05 06:26:01.669 186333 DEBUG nova.network.neutron [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dceafbc9-e9dc-4fdc-9808-7a98ce863957] Updating instance_info_cache with network_info: [{"id": "01674f7c-8e94-4be6-8bf0-a6d112d7fc2a", "address": "fa:16:3e:1f:da:87", "network": {"id": "22a11164-8d14-45f7-8928-10d564f2f223", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-651760568-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "698ff04a95fc4ac6be1eb0e1dbe01302", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01674f7c-8e", "ovs_interfaceid": "01674f7c-8e94-4be6-8bf0-a6d112d7fc2a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:26:02 compute-0 nova_compute[186329]: 2025-12-05 06:26:02.173 186333 DEBUG oslo_concurrency.lockutils [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Releasing lock "refresh_cache-dceafbc9-e9dc-4fdc-9808-7a98ce863957" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:26:02 compute-0 nova_compute[186329]: 2025-12-05 06:26:02.685 186333 DEBUG oslo_concurrency.lockutils [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:26:02 compute-0 nova_compute[186329]: 2025-12-05 06:26:02.686 186333 DEBUG oslo_concurrency.lockutils [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:26:02 compute-0 nova_compute[186329]: 2025-12-05 06:26:02.687 186333 DEBUG oslo_concurrency.lockutils [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:26:02 compute-0 nova_compute[186329]: 2025-12-05 06:26:02.689 186333 INFO nova.virt.libvirt.driver [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dceafbc9-e9dc-4fdc-9808-7a98ce863957] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Dec 05 06:26:02 compute-0 virtqemud[186605]: Domain id=14 name='instance-00000010' uuid=dceafbc9-e9dc-4fdc-9808-7a98ce863957 is tainted: custom-monitor
Dec 05 06:26:02 compute-0 nova_compute[186329]: 2025-12-05 06:26:02.938 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:26:02 compute-0 nova_compute[186329]: 2025-12-05 06:26:02.938 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:26:02 compute-0 nova_compute[186329]: 2025-12-05 06:26:02.938 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:26:02 compute-0 nova_compute[186329]: 2025-12-05 06:26:02.938 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:26:02 compute-0 nova_compute[186329]: 2025-12-05 06:26:02.939 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:26:02 compute-0 nova_compute[186329]: 2025-12-05 06:26:02.939 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 05 06:26:03 compute-0 nova_compute[186329]: 2025-12-05 06:26:03.693 186333 INFO nova.virt.libvirt.driver [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dceafbc9-e9dc-4fdc-9808-7a98ce863957] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Dec 05 06:26:03 compute-0 nova_compute[186329]: 2025-12-05 06:26:03.872 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:04 compute-0 nova_compute[186329]: 2025-12-05 06:26:04.309 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:04 compute-0 nova_compute[186329]: 2025-12-05 06:26:04.698 186333 INFO nova.virt.libvirt.driver [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dceafbc9-e9dc-4fdc-9808-7a98ce863957] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Dec 05 06:26:04 compute-0 nova_compute[186329]: 2025-12-05 06:26:04.701 186333 DEBUG nova.compute.manager [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dceafbc9-e9dc-4fdc-9808-7a98ce863957] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 05 06:26:05 compute-0 nova_compute[186329]: 2025-12-05 06:26:05.206 186333 DEBUG nova.objects.instance [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dceafbc9-e9dc-4fdc-9808-7a98ce863957] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Dec 05 06:26:06 compute-0 nova_compute[186329]: 2025-12-05 06:26:06.217 186333 WARNING neutronclient.v2_0.client [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:26:06 compute-0 nova_compute[186329]: 2025-12-05 06:26:06.562 186333 WARNING neutronclient.v2_0.client [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:26:06 compute-0 nova_compute[186329]: 2025-12-05 06:26:06.563 186333 WARNING neutronclient.v2_0.client [None req-7d8d882b-d5c5-41c9-81e3-04a966bcd2d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:26:08 compute-0 nova_compute[186329]: 2025-12-05 06:26:08.873 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:09 compute-0 nova_compute[186329]: 2025-12-05 06:26:09.287 186333 DEBUG oslo_concurrency.lockutils [None req-7c1b7352-bdba-4556-81aa-e98d42f7a442 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Acquiring lock "0e7a5bec-7f70-404a-bb58-d6b499c04bae" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:26:09 compute-0 nova_compute[186329]: 2025-12-05 06:26:09.287 186333 DEBUG oslo_concurrency.lockutils [None req-7c1b7352-bdba-4556-81aa-e98d42f7a442 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Lock "0e7a5bec-7f70-404a-bb58-d6b499c04bae" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:26:09 compute-0 nova_compute[186329]: 2025-12-05 06:26:09.288 186333 DEBUG oslo_concurrency.lockutils [None req-7c1b7352-bdba-4556-81aa-e98d42f7a442 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Acquiring lock "0e7a5bec-7f70-404a-bb58-d6b499c04bae-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:26:09 compute-0 nova_compute[186329]: 2025-12-05 06:26:09.288 186333 DEBUG oslo_concurrency.lockutils [None req-7c1b7352-bdba-4556-81aa-e98d42f7a442 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Lock "0e7a5bec-7f70-404a-bb58-d6b499c04bae-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:26:09 compute-0 nova_compute[186329]: 2025-12-05 06:26:09.288 186333 DEBUG oslo_concurrency.lockutils [None req-7c1b7352-bdba-4556-81aa-e98d42f7a442 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Lock "0e7a5bec-7f70-404a-bb58-d6b499c04bae-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:26:09 compute-0 nova_compute[186329]: 2025-12-05 06:26:09.296 186333 INFO nova.compute.manager [None req-7c1b7352-bdba-4556-81aa-e98d42f7a442 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Terminating instance
Dec 05 06:26:09 compute-0 nova_compute[186329]: 2025-12-05 06:26:09.310 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:09 compute-0 nova_compute[186329]: 2025-12-05 06:26:09.805 186333 DEBUG nova.compute.manager [None req-7c1b7352-bdba-4556-81aa-e98d42f7a442 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 05 06:26:09 compute-0 kernel: tap22b7cda4-bb (unregistering): left promiscuous mode
Dec 05 06:26:09 compute-0 NetworkManager[55434]: <info>  [1764915969.8322] device (tap22b7cda4-bb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 06:26:09 compute-0 nova_compute[186329]: 2025-12-05 06:26:09.835 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:09 compute-0 ovn_controller[95223]: 2025-12-05T06:26:09Z|00159|binding|INFO|Releasing lport 22b7cda4-bb59-4fdf-812a-a071b6a4d13f from this chassis (sb_readonly=0)
Dec 05 06:26:09 compute-0 ovn_controller[95223]: 2025-12-05T06:26:09Z|00160|binding|INFO|Setting lport 22b7cda4-bb59-4fdf-812a-a071b6a4d13f down in Southbound
Dec 05 06:26:09 compute-0 ovn_controller[95223]: 2025-12-05T06:26:09Z|00161|binding|INFO|Removing iface tap22b7cda4-bb ovn-installed in OVS
Dec 05 06:26:09 compute-0 nova_compute[186329]: 2025-12-05 06:26:09.837 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:09 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:09.840 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:49:64:e1 10.100.0.9'], port_security=['fa:16:3e:49:64:e1 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '0e7a5bec-7f70-404a-bb58-d6b499c04bae', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-22a11164-8d14-45f7-8928-10d564f2f223', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4548dd99e0bd4ca59433132b59d02fcd', 'neutron:revision_number': '5', 'neutron:security_group_ids': '8d117c96-dc46-40e2-bd38-a754a1602f7d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=359366c3-1c13-443d-9597-209e3af69d9c, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], logical_port=22b7cda4-bb59-4fdf-812a-a071b6a4d13f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:26:09 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:09.841 104041 INFO neutron.agent.ovn.metadata.agent [-] Port 22b7cda4-bb59-4fdf-812a-a071b6a4d13f in datapath 22a11164-8d14-45f7-8928-10d564f2f223 unbound from our chassis
Dec 05 06:26:09 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:09.841 104041 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 22a11164-8d14-45f7-8928-10d564f2f223
Dec 05 06:26:09 compute-0 nova_compute[186329]: 2025-12-05 06:26:09.852 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:09 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:09.858 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[8edbecc3-c97b-4abb-8a08-6c4d26b879ef]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:26:09 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000011.scope: Deactivated successfully.
Dec 05 06:26:09 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000011.scope: Consumed 11.966s CPU time.
Dec 05 06:26:09 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:09.895 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[1cbe4efa-949c-49e9-890c-0d1892809b7a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:26:09 compute-0 systemd-machined[152967]: Machine qemu-13-instance-00000011 terminated.
Dec 05 06:26:09 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:09.899 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[a06f9f0a-c907-4627-a85f-e03136f88001]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:26:09 compute-0 podman[212566]: 2025-12-05 06:26:09.921539781 +0000 UTC m=+0.057047094 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4)
Dec 05 06:26:09 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:09.923 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[f4bd2afe-0462-4276-b475-4e18fcce9b4c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:26:09 compute-0 podman[212562]: 2025-12-05 06:26:09.929634128 +0000 UTC m=+0.072528721 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec 05 06:26:09 compute-0 podman[212564]: 2025-12-05 06:26:09.941629931 +0000 UTC m=+0.070178743 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, config_id=edpm, managed_by=edpm_ansible, name=ubi9-minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, io.k8s.display-name=Red Hat 
Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, vcs-type=git, io.openshift.expose-services=, maintainer=Red Hat, Inc.)
Dec 05 06:26:09 compute-0 nova_compute[186329]: 2025-12-05 06:26:09.942 186333 DEBUG nova.compute.manager [req-80bda2e1-a608-4d88-a267-f5ef4d00e5a2 req-eb7346cd-0479-4513-bd4c-49d116997c41 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Received event network-vif-unplugged-22b7cda4-bb59-4fdf-812a-a071b6a4d13f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:26:09 compute-0 nova_compute[186329]: 2025-12-05 06:26:09.943 186333 DEBUG oslo_concurrency.lockutils [req-80bda2e1-a608-4d88-a267-f5ef4d00e5a2 req-eb7346cd-0479-4513-bd4c-49d116997c41 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "0e7a5bec-7f70-404a-bb58-d6b499c04bae-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:26:09 compute-0 nova_compute[186329]: 2025-12-05 06:26:09.943 186333 DEBUG oslo_concurrency.lockutils [req-80bda2e1-a608-4d88-a267-f5ef4d00e5a2 req-eb7346cd-0479-4513-bd4c-49d116997c41 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "0e7a5bec-7f70-404a-bb58-d6b499c04bae-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:26:09 compute-0 nova_compute[186329]: 2025-12-05 06:26:09.943 186333 DEBUG oslo_concurrency.lockutils [req-80bda2e1-a608-4d88-a267-f5ef4d00e5a2 req-eb7346cd-0479-4513-bd4c-49d116997c41 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "0e7a5bec-7f70-404a-bb58-d6b499c04bae-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:26:09 compute-0 nova_compute[186329]: 2025-12-05 06:26:09.943 186333 DEBUG nova.compute.manager [req-80bda2e1-a608-4d88-a267-f5ef4d00e5a2 req-eb7346cd-0479-4513-bd4c-49d116997c41 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] No waiting events found dispatching network-vif-unplugged-22b7cda4-bb59-4fdf-812a-a071b6a4d13f pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:26:09 compute-0 nova_compute[186329]: 2025-12-05 06:26:09.943 186333 DEBUG nova.compute.manager [req-80bda2e1-a608-4d88-a267-f5ef4d00e5a2 req-eb7346cd-0479-4513-bd4c-49d116997c41 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Received event network-vif-unplugged-22b7cda4-bb59-4fdf-812a-a071b6a4d13f for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 05 06:26:09 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:09.944 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[f65c4631-2a50-4a86-bd16-c216ef01421c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap22a11164-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:cb:f5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 379944, 'reachable_time': 23588, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212622, 'error': None, 'target': 'ovnmeta-22a11164-8d14-45f7-8928-10d564f2f223', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:26:09 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:09.957 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[e0c1c843-e209-44ea-8ba8-ffba52aa7295]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap22a11164-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 379951, 'tstamp': 379951}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212625, 'error': None, 'target': 'ovnmeta-22a11164-8d14-45f7-8928-10d564f2f223', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap22a11164-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 379953, 'tstamp': 379953}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212625, 'error': None, 'target': 'ovnmeta-22a11164-8d14-45f7-8928-10d564f2f223', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:26:09 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:09.958 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap22a11164-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:26:09 compute-0 nova_compute[186329]: 2025-12-05 06:26:09.959 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:09 compute-0 nova_compute[186329]: 2025-12-05 06:26:09.962 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:09 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:09.962 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap22a11164-80, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:26:09 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:09.963 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:26:09 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:09.963 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap22a11164-80, col_values=(('external_ids', {'iface-id': '5199b61b-18e1-4d88-bc76-49efa68df776'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:26:09 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:09.964 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:26:09 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:09.964 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[40d4b43e-564e-4a19-931a-10c4998626c9]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-22a11164-8d14-45f7-8928-10d564f2f223\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/22a11164-8d14-45f7-8928-10d564f2f223.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 22a11164-8d14-45f7-8928-10d564f2f223\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:26:10 compute-0 nova_compute[186329]: 2025-12-05 06:26:10.045 186333 INFO nova.virt.libvirt.driver [-] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Instance destroyed successfully.
Dec 05 06:26:10 compute-0 nova_compute[186329]: 2025-12-05 06:26:10.046 186333 DEBUG nova.objects.instance [None req-7c1b7352-bdba-4556-81aa-e98d42f7a442 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Lazy-loading 'resources' on Instance uuid 0e7a5bec-7f70-404a-bb58-d6b499c04bae obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:26:10 compute-0 nova_compute[186329]: 2025-12-05 06:26:10.550 186333 DEBUG nova.virt.libvirt.vif [None req-7c1b7352-bdba-4556-81aa-e98d42f7a442 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-12-05T06:25:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1820134413',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1820134413',id=17,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T06:25:22Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4548dd99e0bd4ca59433132b59d02fcd',ramdisk_id='',reservation_id='r-lhdqy2m7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,member,admin,reader',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virti
o',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-249593407',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-249593407-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T06:25:22Z,user_data=None,user_id='72f24e9b0fa74da299d3bfff79a1fd92',uuid=0e7a5bec-7f70-404a-bb58-d6b499c04bae,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "22b7cda4-bb59-4fdf-812a-a071b6a4d13f", "address": "fa:16:3e:49:64:e1", "network": {"id": "22a11164-8d14-45f7-8928-10d564f2f223", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-651760568-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "698ff04a95fc4ac6be1eb0e1dbe01302", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22b7cda4-bb", "ovs_interfaceid": "22b7cda4-bb59-4fdf-812a-a071b6a4d13f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 05 06:26:10 compute-0 nova_compute[186329]: 2025-12-05 06:26:10.550 186333 DEBUG nova.network.os_vif_util [None req-7c1b7352-bdba-4556-81aa-e98d42f7a442 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Converting VIF {"id": "22b7cda4-bb59-4fdf-812a-a071b6a4d13f", "address": "fa:16:3e:49:64:e1", "network": {"id": "22a11164-8d14-45f7-8928-10d564f2f223", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-651760568-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "698ff04a95fc4ac6be1eb0e1dbe01302", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap22b7cda4-bb", "ovs_interfaceid": "22b7cda4-bb59-4fdf-812a-a071b6a4d13f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:26:10 compute-0 nova_compute[186329]: 2025-12-05 06:26:10.551 186333 DEBUG nova.network.os_vif_util [None req-7c1b7352-bdba-4556-81aa-e98d42f7a442 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:49:64:e1,bridge_name='br-int',has_traffic_filtering=True,id=22b7cda4-bb59-4fdf-812a-a071b6a4d13f,network=Network(22a11164-8d14-45f7-8928-10d564f2f223),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap22b7cda4-bb') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:26:10 compute-0 nova_compute[186329]: 2025-12-05 06:26:10.551 186333 DEBUG os_vif [None req-7c1b7352-bdba-4556-81aa-e98d42f7a442 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:49:64:e1,bridge_name='br-int',has_traffic_filtering=True,id=22b7cda4-bb59-4fdf-812a-a071b6a4d13f,network=Network(22a11164-8d14-45f7-8928-10d564f2f223),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap22b7cda4-bb') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 05 06:26:10 compute-0 nova_compute[186329]: 2025-12-05 06:26:10.552 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:10 compute-0 nova_compute[186329]: 2025-12-05 06:26:10.552 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap22b7cda4-bb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:26:10 compute-0 nova_compute[186329]: 2025-12-05 06:26:10.553 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:10 compute-0 nova_compute[186329]: 2025-12-05 06:26:10.555 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:10 compute-0 nova_compute[186329]: 2025-12-05 06:26:10.555 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:10 compute-0 nova_compute[186329]: 2025-12-05 06:26:10.555 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=e6af741a-e8c4-4d67-bf82-4b0edbbf14e7) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:26:10 compute-0 nova_compute[186329]: 2025-12-05 06:26:10.556 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:10 compute-0 nova_compute[186329]: 2025-12-05 06:26:10.558 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:26:10 compute-0 nova_compute[186329]: 2025-12-05 06:26:10.560 186333 INFO os_vif [None req-7c1b7352-bdba-4556-81aa-e98d42f7a442 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:49:64:e1,bridge_name='br-int',has_traffic_filtering=True,id=22b7cda4-bb59-4fdf-812a-a071b6a4d13f,network=Network(22a11164-8d14-45f7-8928-10d564f2f223),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap22b7cda4-bb')
Dec 05 06:26:10 compute-0 nova_compute[186329]: 2025-12-05 06:26:10.560 186333 INFO nova.virt.libvirt.driver [None req-7c1b7352-bdba-4556-81aa-e98d42f7a442 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Deleting instance files /var/lib/nova/instances/0e7a5bec-7f70-404a-bb58-d6b499c04bae_del
Dec 05 06:26:10 compute-0 nova_compute[186329]: 2025-12-05 06:26:10.561 186333 INFO nova.virt.libvirt.driver [None req-7c1b7352-bdba-4556-81aa-e98d42f7a442 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Deletion of /var/lib/nova/instances/0e7a5bec-7f70-404a-bb58-d6b499c04bae_del complete
Dec 05 06:26:11 compute-0 nova_compute[186329]: 2025-12-05 06:26:11.069 186333 INFO nova.compute.manager [None req-7c1b7352-bdba-4556-81aa-e98d42f7a442 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Took 1.26 seconds to destroy the instance on the hypervisor.
Dec 05 06:26:11 compute-0 nova_compute[186329]: 2025-12-05 06:26:11.069 186333 DEBUG oslo.service.backend._eventlet.loopingcall [None req-7c1b7352-bdba-4556-81aa-e98d42f7a442 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 05 06:26:11 compute-0 nova_compute[186329]: 2025-12-05 06:26:11.069 186333 DEBUG nova.compute.manager [-] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 05 06:26:11 compute-0 nova_compute[186329]: 2025-12-05 06:26:11.069 186333 DEBUG nova.network.neutron [-] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 05 06:26:11 compute-0 nova_compute[186329]: 2025-12-05 06:26:11.069 186333 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:26:11 compute-0 nova_compute[186329]: 2025-12-05 06:26:11.562 186333 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:26:11 compute-0 nova_compute[186329]: 2025-12-05 06:26:11.987 186333 DEBUG nova.compute.manager [req-5e92ceba-32bf-492c-938e-8666e17ebeed req-9995816c-07b9-4d8e-bc8a-fad66e884e7c fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Received event network-vif-unplugged-22b7cda4-bb59-4fdf-812a-a071b6a4d13f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:26:11 compute-0 nova_compute[186329]: 2025-12-05 06:26:11.987 186333 DEBUG oslo_concurrency.lockutils [req-5e92ceba-32bf-492c-938e-8666e17ebeed req-9995816c-07b9-4d8e-bc8a-fad66e884e7c fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "0e7a5bec-7f70-404a-bb58-d6b499c04bae-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:26:11 compute-0 nova_compute[186329]: 2025-12-05 06:26:11.987 186333 DEBUG oslo_concurrency.lockutils [req-5e92ceba-32bf-492c-938e-8666e17ebeed req-9995816c-07b9-4d8e-bc8a-fad66e884e7c fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "0e7a5bec-7f70-404a-bb58-d6b499c04bae-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:26:11 compute-0 nova_compute[186329]: 2025-12-05 06:26:11.987 186333 DEBUG oslo_concurrency.lockutils [req-5e92ceba-32bf-492c-938e-8666e17ebeed req-9995816c-07b9-4d8e-bc8a-fad66e884e7c fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "0e7a5bec-7f70-404a-bb58-d6b499c04bae-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:26:11 compute-0 nova_compute[186329]: 2025-12-05 06:26:11.987 186333 DEBUG nova.compute.manager [req-5e92ceba-32bf-492c-938e-8666e17ebeed req-9995816c-07b9-4d8e-bc8a-fad66e884e7c fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] No waiting events found dispatching network-vif-unplugged-22b7cda4-bb59-4fdf-812a-a071b6a4d13f pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:26:11 compute-0 nova_compute[186329]: 2025-12-05 06:26:11.987 186333 DEBUG nova.compute.manager [req-5e92ceba-32bf-492c-938e-8666e17ebeed req-9995816c-07b9-4d8e-bc8a-fad66e884e7c fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Received event network-vif-unplugged-22b7cda4-bb59-4fdf-812a-a071b6a4d13f for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 05 06:26:11 compute-0 nova_compute[186329]: 2025-12-05 06:26:11.988 186333 DEBUG nova.compute.manager [req-5e92ceba-32bf-492c-938e-8666e17ebeed req-9995816c-07b9-4d8e-bc8a-fad66e884e7c fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Received event network-vif-deleted-22b7cda4-bb59-4fdf-812a-a071b6a4d13f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:26:11 compute-0 nova_compute[186329]: 2025-12-05 06:26:11.988 186333 INFO nova.compute.manager [req-5e92ceba-32bf-492c-938e-8666e17ebeed req-9995816c-07b9-4d8e-bc8a-fad66e884e7c fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Neutron deleted interface 22b7cda4-bb59-4fdf-812a-a071b6a4d13f; detaching it from the instance and deleting it from the info cache
Dec 05 06:26:11 compute-0 nova_compute[186329]: 2025-12-05 06:26:11.988 186333 DEBUG nova.network.neutron [req-5e92ceba-32bf-492c-938e-8666e17ebeed req-9995816c-07b9-4d8e-bc8a-fad66e884e7c fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:26:12 compute-0 nova_compute[186329]: 2025-12-05 06:26:12.243 186333 DEBUG nova.network.neutron [-] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:26:12 compute-0 nova_compute[186329]: 2025-12-05 06:26:12.492 186333 DEBUG nova.compute.manager [req-5e92ceba-32bf-492c-938e-8666e17ebeed req-9995816c-07b9-4d8e-bc8a-fad66e884e7c fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Detach interface failed, port_id=22b7cda4-bb59-4fdf-812a-a071b6a4d13f, reason: Instance 0e7a5bec-7f70-404a-bb58-d6b499c04bae could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Dec 05 06:26:12 compute-0 nova_compute[186329]: 2025-12-05 06:26:12.748 186333 INFO nova.compute.manager [-] [instance: 0e7a5bec-7f70-404a-bb58-d6b499c04bae] Took 1.68 seconds to deallocate network for instance.
Dec 05 06:26:13 compute-0 nova_compute[186329]: 2025-12-05 06:26:13.259 186333 DEBUG oslo_concurrency.lockutils [None req-7c1b7352-bdba-4556-81aa-e98d42f7a442 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:26:13 compute-0 nova_compute[186329]: 2025-12-05 06:26:13.259 186333 DEBUG oslo_concurrency.lockutils [None req-7c1b7352-bdba-4556-81aa-e98d42f7a442 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:26:13 compute-0 nova_compute[186329]: 2025-12-05 06:26:13.318 186333 DEBUG nova.compute.provider_tree [None req-7c1b7352-bdba-4556-81aa-e98d42f7a442 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:26:13 compute-0 nova_compute[186329]: 2025-12-05 06:26:13.822 186333 DEBUG nova.scheduler.client.report [None req-7c1b7352-bdba-4556-81aa-e98d42f7a442 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:26:13 compute-0 nova_compute[186329]: 2025-12-05 06:26:13.875 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:14 compute-0 nova_compute[186329]: 2025-12-05 06:26:14.328 186333 DEBUG oslo_concurrency.lockutils [None req-7c1b7352-bdba-4556-81aa-e98d42f7a442 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.069s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:26:14 compute-0 nova_compute[186329]: 2025-12-05 06:26:14.347 186333 INFO nova.scheduler.client.report [None req-7c1b7352-bdba-4556-81aa-e98d42f7a442 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Deleted allocations for instance 0e7a5bec-7f70-404a-bb58-d6b499c04bae
Dec 05 06:26:15 compute-0 nova_compute[186329]: 2025-12-05 06:26:15.368 186333 DEBUG oslo_concurrency.lockutils [None req-7c1b7352-bdba-4556-81aa-e98d42f7a442 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Lock "0e7a5bec-7f70-404a-bb58-d6b499c04bae" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.081s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:26:15 compute-0 nova_compute[186329]: 2025-12-05 06:26:15.557 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:16 compute-0 nova_compute[186329]: 2025-12-05 06:26:16.605 186333 DEBUG oslo_concurrency.lockutils [None req-fddcaf6c-0eee-4df1-877f-aeaa12e5fa87 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Acquiring lock "dceafbc9-e9dc-4fdc-9808-7a98ce863957" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:26:16 compute-0 nova_compute[186329]: 2025-12-05 06:26:16.606 186333 DEBUG oslo_concurrency.lockutils [None req-fddcaf6c-0eee-4df1-877f-aeaa12e5fa87 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Lock "dceafbc9-e9dc-4fdc-9808-7a98ce863957" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:26:16 compute-0 nova_compute[186329]: 2025-12-05 06:26:16.606 186333 DEBUG oslo_concurrency.lockutils [None req-fddcaf6c-0eee-4df1-877f-aeaa12e5fa87 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Acquiring lock "dceafbc9-e9dc-4fdc-9808-7a98ce863957-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:26:16 compute-0 nova_compute[186329]: 2025-12-05 06:26:16.606 186333 DEBUG oslo_concurrency.lockutils [None req-fddcaf6c-0eee-4df1-877f-aeaa12e5fa87 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Lock "dceafbc9-e9dc-4fdc-9808-7a98ce863957-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:26:16 compute-0 nova_compute[186329]: 2025-12-05 06:26:16.606 186333 DEBUG oslo_concurrency.lockutils [None req-fddcaf6c-0eee-4df1-877f-aeaa12e5fa87 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Lock "dceafbc9-e9dc-4fdc-9808-7a98ce863957-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:26:16 compute-0 nova_compute[186329]: 2025-12-05 06:26:16.614 186333 INFO nova.compute.manager [None req-fddcaf6c-0eee-4df1-877f-aeaa12e5fa87 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: dceafbc9-e9dc-4fdc-9808-7a98ce863957] Terminating instance
Dec 05 06:26:17 compute-0 nova_compute[186329]: 2025-12-05 06:26:17.123 186333 DEBUG nova.compute.manager [None req-fddcaf6c-0eee-4df1-877f-aeaa12e5fa87 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: dceafbc9-e9dc-4fdc-9808-7a98ce863957] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 05 06:26:17 compute-0 kernel: tap01674f7c-8e (unregistering): left promiscuous mode
Dec 05 06:26:17 compute-0 NetworkManager[55434]: <info>  [1764915977.1531] device (tap01674f7c-8e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 06:26:17 compute-0 nova_compute[186329]: 2025-12-05 06:26:17.157 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:17 compute-0 ovn_controller[95223]: 2025-12-05T06:26:17Z|00162|binding|INFO|Releasing lport 01674f7c-8e94-4be6-8bf0-a6d112d7fc2a from this chassis (sb_readonly=0)
Dec 05 06:26:17 compute-0 ovn_controller[95223]: 2025-12-05T06:26:17Z|00163|binding|INFO|Setting lport 01674f7c-8e94-4be6-8bf0-a6d112d7fc2a down in Southbound
Dec 05 06:26:17 compute-0 ovn_controller[95223]: 2025-12-05T06:26:17Z|00164|binding|INFO|Removing iface tap01674f7c-8e ovn-installed in OVS
Dec 05 06:26:17 compute-0 nova_compute[186329]: 2025-12-05 06:26:17.160 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:17 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:17.163 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:da:87 10.100.0.6'], port_security=['fa:16:3e:1f:da:87 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'dceafbc9-e9dc-4fdc-9808-7a98ce863957', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-22a11164-8d14-45f7-8928-10d564f2f223', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4548dd99e0bd4ca59433132b59d02fcd', 'neutron:revision_number': '14', 'neutron:security_group_ids': '8d117c96-dc46-40e2-bd38-a754a1602f7d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=359366c3-1c13-443d-9597-209e3af69d9c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], logical_port=01674f7c-8e94-4be6-8bf0-a6d112d7fc2a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:26:17 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:17.164 104041 INFO neutron.agent.ovn.metadata.agent [-] Port 01674f7c-8e94-4be6-8bf0-a6d112d7fc2a in datapath 22a11164-8d14-45f7-8928-10d564f2f223 unbound from our chassis
Dec 05 06:26:17 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:17.165 104041 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 22a11164-8d14-45f7-8928-10d564f2f223, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 05 06:26:17 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:17.166 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[9ff5745e-ec36-411b-8bbb-adae86b7fc32]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:26:17 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:17.167 104041 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-22a11164-8d14-45f7-8928-10d564f2f223 namespace which is not needed anymore
Dec 05 06:26:17 compute-0 nova_compute[186329]: 2025-12-05 06:26:17.172 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:17 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000010.scope: Deactivated successfully.
Dec 05 06:26:17 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000010.scope: Consumed 1.853s CPU time.
Dec 05 06:26:17 compute-0 systemd-machined[152967]: Machine qemu-14-instance-00000010 terminated.
Dec 05 06:26:17 compute-0 neutron-haproxy-ovnmeta-22a11164-8d14-45f7-8928-10d564f2f223[212307]: [NOTICE]   (212311) : haproxy version is 3.0.5-8e879a5
Dec 05 06:26:17 compute-0 neutron-haproxy-ovnmeta-22a11164-8d14-45f7-8928-10d564f2f223[212307]: [NOTICE]   (212311) : path to executable is /usr/sbin/haproxy
Dec 05 06:26:17 compute-0 neutron-haproxy-ovnmeta-22a11164-8d14-45f7-8928-10d564f2f223[212307]: [WARNING]  (212311) : Exiting Master process...
Dec 05 06:26:17 compute-0 neutron-haproxy-ovnmeta-22a11164-8d14-45f7-8928-10d564f2f223[212307]: [ALERT]    (212311) : Current worker (212313) exited with code 143 (Terminated)
Dec 05 06:26:17 compute-0 neutron-haproxy-ovnmeta-22a11164-8d14-45f7-8928-10d564f2f223[212307]: [WARNING]  (212311) : All workers exited. Exiting... (0)
Dec 05 06:26:17 compute-0 podman[212664]: 2025-12-05 06:26:17.250054426 +0000 UTC m=+0.021487991 container kill c19e6143c45d204daf51c60d4f258cd12e5d943890da6b99cd968ef0689d9bdc (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-22a11164-8d14-45f7-8928-10d564f2f223, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec)
Dec 05 06:26:17 compute-0 systemd[1]: libpod-c19e6143c45d204daf51c60d4f258cd12e5d943890da6b99cd968ef0689d9bdc.scope: Deactivated successfully.
Dec 05 06:26:17 compute-0 nova_compute[186329]: 2025-12-05 06:26:17.269 186333 DEBUG nova.compute.manager [req-e28f5c82-5a90-4b9e-98cf-6c888fe376a1 req-7fbf0078-a9f7-4dd9-99e9-6fd5ca80fb08 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dceafbc9-e9dc-4fdc-9808-7a98ce863957] Received event network-vif-unplugged-01674f7c-8e94-4be6-8bf0-a6d112d7fc2a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:26:17 compute-0 nova_compute[186329]: 2025-12-05 06:26:17.270 186333 DEBUG oslo_concurrency.lockutils [req-e28f5c82-5a90-4b9e-98cf-6c888fe376a1 req-7fbf0078-a9f7-4dd9-99e9-6fd5ca80fb08 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "dceafbc9-e9dc-4fdc-9808-7a98ce863957-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:26:17 compute-0 nova_compute[186329]: 2025-12-05 06:26:17.270 186333 DEBUG oslo_concurrency.lockutils [req-e28f5c82-5a90-4b9e-98cf-6c888fe376a1 req-7fbf0078-a9f7-4dd9-99e9-6fd5ca80fb08 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "dceafbc9-e9dc-4fdc-9808-7a98ce863957-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:26:17 compute-0 nova_compute[186329]: 2025-12-05 06:26:17.270 186333 DEBUG oslo_concurrency.lockutils [req-e28f5c82-5a90-4b9e-98cf-6c888fe376a1 req-7fbf0078-a9f7-4dd9-99e9-6fd5ca80fb08 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "dceafbc9-e9dc-4fdc-9808-7a98ce863957-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:26:17 compute-0 nova_compute[186329]: 2025-12-05 06:26:17.271 186333 DEBUG nova.compute.manager [req-e28f5c82-5a90-4b9e-98cf-6c888fe376a1 req-7fbf0078-a9f7-4dd9-99e9-6fd5ca80fb08 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dceafbc9-e9dc-4fdc-9808-7a98ce863957] No waiting events found dispatching network-vif-unplugged-01674f7c-8e94-4be6-8bf0-a6d112d7fc2a pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:26:17 compute-0 nova_compute[186329]: 2025-12-05 06:26:17.271 186333 DEBUG nova.compute.manager [req-e28f5c82-5a90-4b9e-98cf-6c888fe376a1 req-7fbf0078-a9f7-4dd9-99e9-6fd5ca80fb08 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dceafbc9-e9dc-4fdc-9808-7a98ce863957] Received event network-vif-unplugged-01674f7c-8e94-4be6-8bf0-a6d112d7fc2a for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 05 06:26:17 compute-0 podman[212676]: 2025-12-05 06:26:17.284941286 +0000 UTC m=+0.020560376 container died c19e6143c45d204daf51c60d4f258cd12e5d943890da6b99cd968ef0689d9bdc (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-22a11164-8d14-45f7-8928-10d564f2f223, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec)
Dec 05 06:26:17 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c19e6143c45d204daf51c60d4f258cd12e5d943890da6b99cd968ef0689d9bdc-userdata-shm.mount: Deactivated successfully.
Dec 05 06:26:17 compute-0 systemd[1]: var-lib-containers-storage-overlay-3862986a72985d774eebee1c1bfc4271551791a601ea6502a247404a08b14524-merged.mount: Deactivated successfully.
Dec 05 06:26:17 compute-0 podman[212676]: 2025-12-05 06:26:17.303751239 +0000 UTC m=+0.039370309 container cleanup c19e6143c45d204daf51c60d4f258cd12e5d943890da6b99cd968ef0689d9bdc (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-22a11164-8d14-45f7-8928-10d564f2f223, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 05 06:26:17 compute-0 systemd[1]: libpod-conmon-c19e6143c45d204daf51c60d4f258cd12e5d943890da6b99cd968ef0689d9bdc.scope: Deactivated successfully.
Dec 05 06:26:17 compute-0 podman[212677]: 2025-12-05 06:26:17.311265167 +0000 UTC m=+0.044196354 container remove c19e6143c45d204daf51c60d4f258cd12e5d943890da6b99cd968ef0689d9bdc (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-22a11164-8d14-45f7-8928-10d564f2f223, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec)
Dec 05 06:26:17 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:17.314 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[d0e9a7d2-045b-482f-b7bc-3e63c4719f2f]: (4, ("Fri Dec  5 06:26:17 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-22a11164-8d14-45f7-8928-10d564f2f223 (c19e6143c45d204daf51c60d4f258cd12e5d943890da6b99cd968ef0689d9bdc)\nc19e6143c45d204daf51c60d4f258cd12e5d943890da6b99cd968ef0689d9bdc\nFri Dec  5 06:26:17 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-22a11164-8d14-45f7-8928-10d564f2f223 (c19e6143c45d204daf51c60d4f258cd12e5d943890da6b99cd968ef0689d9bdc)\nc19e6143c45d204daf51c60d4f258cd12e5d943890da6b99cd968ef0689d9bdc\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:26:17 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:17.315 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[e97e0799-3f8a-49e0-b2e7-2397939f8db9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:26:17 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:17.315 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/22a11164-8d14-45f7-8928-10d564f2f223.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/22a11164-8d14-45f7-8928-10d564f2f223.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:26:17 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:17.316 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[a0c66620-7acf-4885-b23c-b05ab70d3da0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:26:17 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:17.316 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap22a11164-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:26:17 compute-0 nova_compute[186329]: 2025-12-05 06:26:17.318 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:17 compute-0 kernel: tap22a11164-80: left promiscuous mode
Dec 05 06:26:17 compute-0 nova_compute[186329]: 2025-12-05 06:26:17.331 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:17 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:17.335 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[092f5e3c-303d-4906-a5ad-58b589fea4bb]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:26:17 compute-0 NetworkManager[55434]: <info>  [1764915977.3361] manager: (tap01674f7c-8e): new Tun device (/org/freedesktop/NetworkManager/Devices/72)
Dec 05 06:26:17 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:17.346 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[b917c9c7-4496-48c4-86f9-db0a0e932c6d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:26:17 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:17.346 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[fbfd6eaf-b055-4f02-9ad6-2872e8420a7c]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:26:17 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:17.358 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[3b258ab8-9c18-42a5-bc24-8af2edcffd1c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 379939, 'reachable_time': 24910, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212714, 'error': None, 'target': 'ovnmeta-22a11164-8d14-45f7-8928-10d564f2f223', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:26:17 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:17.359 104153 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-22a11164-8d14-45f7-8928-10d564f2f223 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 05 06:26:17 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:17.360 104153 DEBUG oslo.privsep.daemon [-] privsep: reply[ff97a3a8-5761-495f-955d-f1dbac764779]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:26:17 compute-0 systemd[1]: run-netns-ovnmeta\x2d22a11164\x2d8d14\x2d45f7\x2d8928\x2d10d564f2f223.mount: Deactivated successfully.
Dec 05 06:26:17 compute-0 nova_compute[186329]: 2025-12-05 06:26:17.362 186333 INFO nova.virt.libvirt.driver [-] [instance: dceafbc9-e9dc-4fdc-9808-7a98ce863957] Instance destroyed successfully.
Dec 05 06:26:17 compute-0 nova_compute[186329]: 2025-12-05 06:26:17.362 186333 DEBUG nova.objects.instance [None req-fddcaf6c-0eee-4df1-877f-aeaa12e5fa87 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Lazy-loading 'resources' on Instance uuid dceafbc9-e9dc-4fdc-9808-7a98ce863957 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:26:17 compute-0 nova_compute[186329]: 2025-12-05 06:26:17.866 186333 DEBUG nova.virt.libvirt.vif [None req-fddcaf6c-0eee-4df1-877f-aeaa12e5fa87 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-12-05T06:24:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-144014883',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-144014883',id=16,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T06:25:05Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4548dd99e0bd4ca59433132b59d02fcd',ramdisk_id='',reservation_id='r-jw9bqagv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,member,admin,reader',clean_attempts='1',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_
vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-249593407',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-249593407-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T06:26:05Z,user_data=None,user_id='72f24e9b0fa74da299d3bfff79a1fd92',uuid=dceafbc9-e9dc-4fdc-9808-7a98ce863957,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "01674f7c-8e94-4be6-8bf0-a6d112d7fc2a", "address": "fa:16:3e:1f:da:87", "network": {"id": "22a11164-8d14-45f7-8928-10d564f2f223", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-651760568-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "698ff04a95fc4ac6be1eb0e1dbe01302", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01674f7c-8e", "ovs_interfaceid": "01674f7c-8e94-4be6-8bf0-a6d112d7fc2a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 05 06:26:17 compute-0 nova_compute[186329]: 2025-12-05 06:26:17.866 186333 DEBUG nova.network.os_vif_util [None req-fddcaf6c-0eee-4df1-877f-aeaa12e5fa87 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Converting VIF {"id": "01674f7c-8e94-4be6-8bf0-a6d112d7fc2a", "address": "fa:16:3e:1f:da:87", "network": {"id": "22a11164-8d14-45f7-8928-10d564f2f223", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-651760568-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "698ff04a95fc4ac6be1eb0e1dbe01302", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap01674f7c-8e", "ovs_interfaceid": "01674f7c-8e94-4be6-8bf0-a6d112d7fc2a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:26:17 compute-0 nova_compute[186329]: 2025-12-05 06:26:17.867 186333 DEBUG nova.network.os_vif_util [None req-fddcaf6c-0eee-4df1-877f-aeaa12e5fa87 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1f:da:87,bridge_name='br-int',has_traffic_filtering=True,id=01674f7c-8e94-4be6-8bf0-a6d112d7fc2a,network=Network(22a11164-8d14-45f7-8928-10d564f2f223),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01674f7c-8e') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:26:17 compute-0 nova_compute[186329]: 2025-12-05 06:26:17.867 186333 DEBUG os_vif [None req-fddcaf6c-0eee-4df1-877f-aeaa12e5fa87 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1f:da:87,bridge_name='br-int',has_traffic_filtering=True,id=01674f7c-8e94-4be6-8bf0-a6d112d7fc2a,network=Network(22a11164-8d14-45f7-8928-10d564f2f223),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01674f7c-8e') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 05 06:26:17 compute-0 nova_compute[186329]: 2025-12-05 06:26:17.868 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:17 compute-0 nova_compute[186329]: 2025-12-05 06:26:17.868 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap01674f7c-8e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:26:17 compute-0 nova_compute[186329]: 2025-12-05 06:26:17.870 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:17 compute-0 nova_compute[186329]: 2025-12-05 06:26:17.871 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:17 compute-0 nova_compute[186329]: 2025-12-05 06:26:17.871 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=a72fc3a6-9e14-4de7-bdae-6c5cc8479514) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:26:17 compute-0 nova_compute[186329]: 2025-12-05 06:26:17.872 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:17 compute-0 nova_compute[186329]: 2025-12-05 06:26:17.872 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:17 compute-0 nova_compute[186329]: 2025-12-05 06:26:17.874 186333 INFO os_vif [None req-fddcaf6c-0eee-4df1-877f-aeaa12e5fa87 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1f:da:87,bridge_name='br-int',has_traffic_filtering=True,id=01674f7c-8e94-4be6-8bf0-a6d112d7fc2a,network=Network(22a11164-8d14-45f7-8928-10d564f2f223),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap01674f7c-8e')
Dec 05 06:26:17 compute-0 nova_compute[186329]: 2025-12-05 06:26:17.874 186333 INFO nova.virt.libvirt.driver [None req-fddcaf6c-0eee-4df1-877f-aeaa12e5fa87 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: dceafbc9-e9dc-4fdc-9808-7a98ce863957] Deleting instance files /var/lib/nova/instances/dceafbc9-e9dc-4fdc-9808-7a98ce863957_del
Dec 05 06:26:17 compute-0 nova_compute[186329]: 2025-12-05 06:26:17.874 186333 INFO nova.virt.libvirt.driver [None req-fddcaf6c-0eee-4df1-877f-aeaa12e5fa87 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: dceafbc9-e9dc-4fdc-9808-7a98ce863957] Deletion of /var/lib/nova/instances/dceafbc9-e9dc-4fdc-9808-7a98ce863957_del complete
Dec 05 06:26:18 compute-0 nova_compute[186329]: 2025-12-05 06:26:18.382 186333 INFO nova.compute.manager [None req-fddcaf6c-0eee-4df1-877f-aeaa12e5fa87 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] [instance: dceafbc9-e9dc-4fdc-9808-7a98ce863957] Took 1.26 seconds to destroy the instance on the hypervisor.
Dec 05 06:26:18 compute-0 nova_compute[186329]: 2025-12-05 06:26:18.382 186333 DEBUG oslo.service.backend._eventlet.loopingcall [None req-fddcaf6c-0eee-4df1-877f-aeaa12e5fa87 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 05 06:26:18 compute-0 nova_compute[186329]: 2025-12-05 06:26:18.383 186333 DEBUG nova.compute.manager [-] [instance: dceafbc9-e9dc-4fdc-9808-7a98ce863957] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 05 06:26:18 compute-0 nova_compute[186329]: 2025-12-05 06:26:18.383 186333 DEBUG nova.network.neutron [-] [instance: dceafbc9-e9dc-4fdc-9808-7a98ce863957] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 05 06:26:18 compute-0 nova_compute[186329]: 2025-12-05 06:26:18.383 186333 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:26:18 compute-0 nova_compute[186329]: 2025-12-05 06:26:18.562 186333 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:26:18 compute-0 nova_compute[186329]: 2025-12-05 06:26:18.875 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:19 compute-0 nova_compute[186329]: 2025-12-05 06:26:19.242 186333 DEBUG nova.network.neutron [-] [instance: dceafbc9-e9dc-4fdc-9808-7a98ce863957] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:26:19 compute-0 nova_compute[186329]: 2025-12-05 06:26:19.309 186333 DEBUG nova.compute.manager [req-7f356bc7-29fa-4d27-a778-7cd65444db18 req-10460a0f-6ae8-4103-9e34-5d52aa858dc9 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dceafbc9-e9dc-4fdc-9808-7a98ce863957] Received event network-vif-unplugged-01674f7c-8e94-4be6-8bf0-a6d112d7fc2a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:26:19 compute-0 nova_compute[186329]: 2025-12-05 06:26:19.310 186333 DEBUG oslo_concurrency.lockutils [req-7f356bc7-29fa-4d27-a778-7cd65444db18 req-10460a0f-6ae8-4103-9e34-5d52aa858dc9 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "dceafbc9-e9dc-4fdc-9808-7a98ce863957-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:26:19 compute-0 nova_compute[186329]: 2025-12-05 06:26:19.310 186333 DEBUG oslo_concurrency.lockutils [req-7f356bc7-29fa-4d27-a778-7cd65444db18 req-10460a0f-6ae8-4103-9e34-5d52aa858dc9 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "dceafbc9-e9dc-4fdc-9808-7a98ce863957-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:26:19 compute-0 nova_compute[186329]: 2025-12-05 06:26:19.310 186333 DEBUG oslo_concurrency.lockutils [req-7f356bc7-29fa-4d27-a778-7cd65444db18 req-10460a0f-6ae8-4103-9e34-5d52aa858dc9 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "dceafbc9-e9dc-4fdc-9808-7a98ce863957-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:26:19 compute-0 nova_compute[186329]: 2025-12-05 06:26:19.310 186333 DEBUG nova.compute.manager [req-7f356bc7-29fa-4d27-a778-7cd65444db18 req-10460a0f-6ae8-4103-9e34-5d52aa858dc9 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dceafbc9-e9dc-4fdc-9808-7a98ce863957] No waiting events found dispatching network-vif-unplugged-01674f7c-8e94-4be6-8bf0-a6d112d7fc2a pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:26:19 compute-0 nova_compute[186329]: 2025-12-05 06:26:19.310 186333 DEBUG nova.compute.manager [req-7f356bc7-29fa-4d27-a778-7cd65444db18 req-10460a0f-6ae8-4103-9e34-5d52aa858dc9 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dceafbc9-e9dc-4fdc-9808-7a98ce863957] Received event network-vif-unplugged-01674f7c-8e94-4be6-8bf0-a6d112d7fc2a for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 05 06:26:19 compute-0 nova_compute[186329]: 2025-12-05 06:26:19.310 186333 DEBUG nova.compute.manager [req-7f356bc7-29fa-4d27-a778-7cd65444db18 req-10460a0f-6ae8-4103-9e34-5d52aa858dc9 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dceafbc9-e9dc-4fdc-9808-7a98ce863957] Received event network-vif-deleted-01674f7c-8e94-4be6-8bf0-a6d112d7fc2a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:26:19 compute-0 nova_compute[186329]: 2025-12-05 06:26:19.747 186333 INFO nova.compute.manager [-] [instance: dceafbc9-e9dc-4fdc-9808-7a98ce863957] Took 1.36 seconds to deallocate network for instance.
Dec 05 06:26:20 compute-0 nova_compute[186329]: 2025-12-05 06:26:20.260 186333 DEBUG oslo_concurrency.lockutils [None req-fddcaf6c-0eee-4df1-877f-aeaa12e5fa87 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:26:20 compute-0 nova_compute[186329]: 2025-12-05 06:26:20.261 186333 DEBUG oslo_concurrency.lockutils [None req-fddcaf6c-0eee-4df1-877f-aeaa12e5fa87 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:26:20 compute-0 nova_compute[186329]: 2025-12-05 06:26:20.264 186333 DEBUG oslo_concurrency.lockutils [None req-fddcaf6c-0eee-4df1-877f-aeaa12e5fa87 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.004s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:26:20 compute-0 nova_compute[186329]: 2025-12-05 06:26:20.285 186333 INFO nova.scheduler.client.report [None req-fddcaf6c-0eee-4df1-877f-aeaa12e5fa87 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Deleted allocations for instance dceafbc9-e9dc-4fdc-9808-7a98ce863957
Dec 05 06:26:21 compute-0 nova_compute[186329]: 2025-12-05 06:26:21.302 186333 DEBUG oslo_concurrency.lockutils [None req-fddcaf6c-0eee-4df1-877f-aeaa12e5fa87 72f24e9b0fa74da299d3bfff79a1fd92 4548dd99e0bd4ca59433132b59d02fcd - - default default] Lock "dceafbc9-e9dc-4fdc-9808-7a98ce863957" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.696s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:26:22 compute-0 nova_compute[186329]: 2025-12-05 06:26:22.873 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:23 compute-0 nova_compute[186329]: 2025-12-05 06:26:23.876 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:26 compute-0 nova_compute[186329]: 2025-12-05 06:26:26.011 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:27 compute-0 nova_compute[186329]: 2025-12-05 06:26:27.874 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:28 compute-0 nova_compute[186329]: 2025-12-05 06:26:28.877 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:29.512 104041 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:26:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:29.512 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:26:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:29.512 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:26:29 compute-0 podman[196599]: time="2025-12-05T06:26:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:26:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:26:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:26:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:26:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2586 "" "Go-http-client/1.1"
Dec 05 06:26:30 compute-0 podman[212723]: 2025-12-05 06:26:30.462144608 +0000 UTC m=+0.046148373 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 06:26:30 compute-0 podman[212722]: 2025-12-05 06:26:30.483341319 +0000 UTC m=+0.067641122 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec)
Dec 05 06:26:31 compute-0 openstack_network_exporter[198686]: ERROR   06:26:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:26:31 compute-0 openstack_network_exporter[198686]: ERROR   06:26:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:26:31 compute-0 openstack_network_exporter[198686]: ERROR   06:26:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:26:31 compute-0 openstack_network_exporter[198686]: ERROR   06:26:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:26:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:26:31 compute-0 openstack_network_exporter[198686]: ERROR   06:26:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:26:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:26:32 compute-0 nova_compute[186329]: 2025-12-05 06:26:32.875 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:33 compute-0 nova_compute[186329]: 2025-12-05 06:26:33.879 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:35 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:35.300 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:47:f5:eb 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a0d03e43-34ea-4028-a2db-d5149175a508', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a604f2bba734821844adb92d1c578a7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ea3a7080-e915-4f20-bdbc-22c4af619c01, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=88888304-ddfb-428f-a56e-30bd63031520) old=Port_Binding(mac=['fa:16:3e:47:f5:eb'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a0d03e43-34ea-4028-a2db-d5149175a508', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2a604f2bba734821844adb92d1c578a7', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:26:35 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:35.301 104041 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 88888304-ddfb-428f-a56e-30bd63031520 in datapath a0d03e43-34ea-4028-a2db-d5149175a508 updated
Dec 05 06:26:35 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:35.302 104041 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a0d03e43-34ea-4028-a2db-d5149175a508, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 05 06:26:35 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:35.303 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[0dac20c4-1fe4-42ea-9af3-803e852a2a44]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:26:37 compute-0 nova_compute[186329]: 2025-12-05 06:26:37.877 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:38 compute-0 nova_compute[186329]: 2025-12-05 06:26:38.880 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:40 compute-0 podman[212769]: 2025-12-05 06:26:40.471333034 +0000 UTC m=+0.044629216 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 05 06:26:40 compute-0 podman[212770]: 2025-12-05 06:26:40.475401063 +0000 UTC m=+0.047695652 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, maintainer=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, distribution-scope=public, managed_by=edpm_ansible, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc.)
Dec 05 06:26:40 compute-0 podman[212771]: 2025-12-05 06:26:40.475391916 +0000 UTC m=+0.045174704 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 05 06:26:42 compute-0 nova_compute[186329]: 2025-12-05 06:26:42.879 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:43 compute-0 nova_compute[186329]: 2025-12-05 06:26:43.882 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:45 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:45.375 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:44:5b:fc 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-f90f3247-d2af-4090-b837-722ec19b99c8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f90f3247-d2af-4090-b837-722ec19b99c8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd7bd52c6f43e4c0fbd1009b9f0994d4c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=38a27943-62a9-4c0b-9dca-949f64df3419, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f92c49d6-4acd-4c0c-8c47-1729d72107b0) old=Port_Binding(mac=['fa:16:3e:44:5b:fc'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-f90f3247-d2af-4090-b837-722ec19b99c8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f90f3247-d2af-4090-b837-722ec19b99c8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd7bd52c6f43e4c0fbd1009b9f0994d4c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:26:45 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:45.376 104041 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f92c49d6-4acd-4c0c-8c47-1729d72107b0 in datapath f90f3247-d2af-4090-b837-722ec19b99c8 updated
Dec 05 06:26:45 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:45.377 104041 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f90f3247-d2af-4090-b837-722ec19b99c8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 05 06:26:45 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:45.377 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[21adfdb6-d6ea-4b7e-9aeb-73a2b4e2b2d6]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:26:47 compute-0 nova_compute[186329]: 2025-12-05 06:26:47.879 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:48 compute-0 nova_compute[186329]: 2025-12-05 06:26:48.884 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:50 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:50.022 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '46:84:d1', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'a6:99:88:db:d9:a2'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:26:50 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:50.022 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 05 06:26:50 compute-0 nova_compute[186329]: 2025-12-05 06:26:50.022 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:52 compute-0 nova_compute[186329]: 2025-12-05 06:26:52.881 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:53 compute-0 nova_compute[186329]: 2025-12-05 06:26:53.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:26:53 compute-0 nova_compute[186329]: 2025-12-05 06:26:53.885 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:54 compute-0 nova_compute[186329]: 2025-12-05 06:26:54.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:26:55 compute-0 nova_compute[186329]: 2025-12-05 06:26:55.222 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:26:55 compute-0 nova_compute[186329]: 2025-12-05 06:26:55.222 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:26:55 compute-0 nova_compute[186329]: 2025-12-05 06:26:55.222 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:26:55 compute-0 nova_compute[186329]: 2025-12-05 06:26:55.223 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 05 06:26:56 compute-0 nova_compute[186329]: 2025-12-05 06:26:56.249 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:26:56 compute-0 nova_compute[186329]: 2025-12-05 06:26:56.290 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:26:56 compute-0 nova_compute[186329]: 2025-12-05 06:26:56.290 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:26:56 compute-0 nova_compute[186329]: 2025-12-05 06:26:56.329 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk --force-share --output=json" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:26:56 compute-0 nova_compute[186329]: 2025-12-05 06:26:56.491 186333 WARNING nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:26:56 compute-0 nova_compute[186329]: 2025-12-05 06:26:56.492 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:26:56 compute-0 nova_compute[186329]: 2025-12-05 06:26:56.504 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.012s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:26:56 compute-0 nova_compute[186329]: 2025-12-05 06:26:56.504 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5722MB free_disk=73.1380386352539GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": 
"0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 05 06:26:56 compute-0 nova_compute[186329]: 2025-12-05 06:26:56.505 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:26:56 compute-0 nova_compute[186329]: 2025-12-05 06:26:56.505 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:26:57 compute-0 nova_compute[186329]: 2025-12-05 06:26:57.883 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:58 compute-0 nova_compute[186329]: 2025-12-05 06:26:58.042 186333 INFO nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance 83dcc9e9-9fcf-4e34-8b76-598c38ed9bc0 has allocations against this compute host but is not found in the database.
Dec 05 06:26:58 compute-0 nova_compute[186329]: 2025-12-05 06:26:58.042 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 05 06:26:58 compute-0 nova_compute[186329]: 2025-12-05 06:26:58.043 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 06:26:56 up  1:04,  0 user,  load average: 0.13, 0.19, 0.25\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 05 06:26:58 compute-0 nova_compute[186329]: 2025-12-05 06:26:58.069 186333 DEBUG nova.compute.provider_tree [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:26:58 compute-0 nova_compute[186329]: 2025-12-05 06:26:58.572 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:26:58 compute-0 nova_compute[186329]: 2025-12-05 06:26:58.887 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:26:58 compute-0 ovn_controller[95223]: 2025-12-05T06:26:58Z|00165|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Dec 05 06:26:59 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:26:59.023 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=89d40815-76f5-4f1d-9077-84d831b7d6c4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:26:59 compute-0 nova_compute[186329]: 2025-12-05 06:26:59.078 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 05 06:26:59 compute-0 nova_compute[186329]: 2025-12-05 06:26:59.079 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.574s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:26:59 compute-0 podman[196599]: time="2025-12-05T06:26:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:26:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:26:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:26:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:26:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2587 "" "Go-http-client/1.1"
Dec 05 06:27:00 compute-0 nova_compute[186329]: 2025-12-05 06:27:00.075 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:27:00 compute-0 nova_compute[186329]: 2025-12-05 06:27:00.075 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:27:00 compute-0 nova_compute[186329]: 2025-12-05 06:27:00.581 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:27:00 compute-0 nova_compute[186329]: 2025-12-05 06:27:00.582 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:27:00 compute-0 nova_compute[186329]: 2025-12-05 06:27:00.582 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:27:00 compute-0 nova_compute[186329]: 2025-12-05 06:27:00.582 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 05 06:27:00 compute-0 nova_compute[186329]: 2025-12-05 06:27:00.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:27:01 compute-0 openstack_network_exporter[198686]: ERROR   06:27:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:27:01 compute-0 openstack_network_exporter[198686]: ERROR   06:27:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:27:01 compute-0 openstack_network_exporter[198686]: ERROR   06:27:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:27:01 compute-0 openstack_network_exporter[198686]: ERROR   06:27:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:27:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:27:01 compute-0 openstack_network_exporter[198686]: ERROR   06:27:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:27:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:27:01 compute-0 podman[212830]: 2025-12-05 06:27:01.481337802 +0000 UTC m=+0.049381363 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 06:27:01 compute-0 podman[212829]: 2025-12-05 06:27:01.500952398 +0000 UTC m=+0.083303487 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.4)
Dec 05 06:27:01 compute-0 nova_compute[186329]: 2025-12-05 06:27:01.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:27:02 compute-0 nova_compute[186329]: 2025-12-05 06:27:02.885 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:27:03 compute-0 nova_compute[186329]: 2025-12-05 06:27:03.890 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:27:07 compute-0 nova_compute[186329]: 2025-12-05 06:27:07.887 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:27:08 compute-0 nova_compute[186329]: 2025-12-05 06:27:08.890 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:27:11 compute-0 podman[212874]: 2025-12-05 06:27:11.459371537 +0000 UTC m=+0.044397673 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Dec 05 06:27:11 compute-0 podman[212876]: 2025-12-05 06:27:11.491609556 +0000 UTC m=+0.067259334 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 05 06:27:11 compute-0 podman[212875]: 2025-12-05 06:27:11.49168532 +0000 UTC m=+0.066327974 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, vcs-type=git, release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., name=ubi9-minimal, version=9.6, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 05 06:27:12 compute-0 nova_compute[186329]: 2025-12-05 06:27:12.889 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:27:13 compute-0 nova_compute[186329]: 2025-12-05 06:27:13.893 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:27:17 compute-0 nova_compute[186329]: 2025-12-05 06:27:17.891 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:27:18 compute-0 nova_compute[186329]: 2025-12-05 06:27:18.895 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:27:22 compute-0 nova_compute[186329]: 2025-12-05 06:27:22.893 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:27:23 compute-0 nova_compute[186329]: 2025-12-05 06:27:23.896 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:27:27 compute-0 nova_compute[186329]: 2025-12-05 06:27:27.894 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:27:28 compute-0 nova_compute[186329]: 2025-12-05 06:27:28.897 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:27:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:27:29.513 104041 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:27:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:27:29.513 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:27:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:27:29.513 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:27:29 compute-0 podman[196599]: time="2025-12-05T06:27:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:27:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:27:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:27:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:27:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2594 "" "Go-http-client/1.1"
Dec 05 06:27:31 compute-0 openstack_network_exporter[198686]: ERROR   06:27:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:27:31 compute-0 openstack_network_exporter[198686]: ERROR   06:27:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:27:31 compute-0 openstack_network_exporter[198686]: ERROR   06:27:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:27:31 compute-0 openstack_network_exporter[198686]: ERROR   06:27:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:27:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:27:31 compute-0 openstack_network_exporter[198686]: ERROR   06:27:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:27:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:27:32 compute-0 podman[212928]: 2025-12-05 06:27:32.4553536 +0000 UTC m=+0.037353549 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 05 06:27:32 compute-0 podman[212927]: 2025-12-05 06:27:32.480510898 +0000 UTC m=+0.064994135 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 05 06:27:32 compute-0 nova_compute[186329]: 2025-12-05 06:27:32.896 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:27:33 compute-0 nova_compute[186329]: 2025-12-05 06:27:33.898 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:27:37 compute-0 nova_compute[186329]: 2025-12-05 06:27:37.897 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:27:38 compute-0 nova_compute[186329]: 2025-12-05 06:27:38.900 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:27:41 compute-0 nova_compute[186329]: 2025-12-05 06:27:41.927 186333 DEBUG nova.virt.libvirt.driver [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: bbcb12e6-ebf7-49e2-847a-65f1b3a3266c] Creating tmpfile /var/lib/nova/instances/tmplun_u1h0 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Dec 05 06:27:41 compute-0 nova_compute[186329]: 2025-12-05 06:27:41.927 186333 WARNING neutronclient.v2_0.client [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:27:41 compute-0 nova_compute[186329]: 2025-12-05 06:27:41.929 186333 DEBUG nova.compute.manager [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmplun_u1h0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Dec 05 06:27:42 compute-0 podman[212974]: 2025-12-05 06:27:42.469279744 +0000 UTC m=+0.048418698 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, release=1755695350, build-date=2025-08-20T13:12:41, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.33.7, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': 
'/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Dec 05 06:27:42 compute-0 podman[212973]: 2025-12-05 06:27:42.471880393 +0000 UTC m=+0.053480595 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec 05 06:27:42 compute-0 podman[212975]: 2025-12-05 06:27:42.473169779 +0000 UTC m=+0.049634485 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, container_name=multipathd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true)
Dec 05 06:27:42 compute-0 nova_compute[186329]: 2025-12-05 06:27:42.898 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:27:43 compute-0 nova_compute[186329]: 2025-12-05 06:27:43.901 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:27:43 compute-0 nova_compute[186329]: 2025-12-05 06:27:43.958 186333 WARNING neutronclient.v2_0.client [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:27:47 compute-0 nova_compute[186329]: 2025-12-05 06:27:47.777 186333 DEBUG nova.compute.manager [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmplun_u1h0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='bbcb12e6-ebf7-49e2-847a-65f1b3a3266c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Dec 05 06:27:47 compute-0 nova_compute[186329]: 2025-12-05 06:27:47.900 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:27:48 compute-0 nova_compute[186329]: 2025-12-05 06:27:48.786 186333 DEBUG oslo_concurrency.lockutils [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "refresh_cache-bbcb12e6-ebf7-49e2-847a-65f1b3a3266c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:27:48 compute-0 nova_compute[186329]: 2025-12-05 06:27:48.787 186333 DEBUG oslo_concurrency.lockutils [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquired lock "refresh_cache-bbcb12e6-ebf7-49e2-847a-65f1b3a3266c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:27:48 compute-0 nova_compute[186329]: 2025-12-05 06:27:48.787 186333 DEBUG nova.network.neutron [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: bbcb12e6-ebf7-49e2-847a-65f1b3a3266c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 05 06:27:48 compute-0 nova_compute[186329]: 2025-12-05 06:27:48.903 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:27:49 compute-0 nova_compute[186329]: 2025-12-05 06:27:49.291 186333 WARNING neutronclient.v2_0.client [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:27:49 compute-0 nova_compute[186329]: 2025-12-05 06:27:49.785 186333 WARNING neutronclient.v2_0.client [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:27:49 compute-0 nova_compute[186329]: 2025-12-05 06:27:49.884 186333 DEBUG nova.network.neutron [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: bbcb12e6-ebf7-49e2-847a-65f1b3a3266c] Updating instance_info_cache with network_info: [{"id": "6814c1d4-d0e4-49a9-92b0-3d640e6a64e2", "address": "fa:16:3e:77:88:29", "network": {"id": "a0d03e43-34ea-4028-a2db-d5149175a508", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1338527276-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a604f2bba734821844adb92d1c578a7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6814c1d4-d0", "ovs_interfaceid": "6814c1d4-d0e4-49a9-92b0-3d640e6a64e2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:27:50 compute-0 nova_compute[186329]: 2025-12-05 06:27:50.388 186333 DEBUG oslo_concurrency.lockutils [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Releasing lock "refresh_cache-bbcb12e6-ebf7-49e2-847a-65f1b3a3266c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:27:50 compute-0 nova_compute[186329]: 2025-12-05 06:27:50.396 186333 DEBUG nova.virt.libvirt.driver [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: bbcb12e6-ebf7-49e2-847a-65f1b3a3266c] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmplun_u1h0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='bbcb12e6-ebf7-49e2-847a-65f1b3a3266c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Dec 05 06:27:50 compute-0 nova_compute[186329]: 2025-12-05 06:27:50.397 186333 DEBUG nova.virt.libvirt.driver [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: bbcb12e6-ebf7-49e2-847a-65f1b3a3266c] Creating instance directory: /var/lib/nova/instances/bbcb12e6-ebf7-49e2-847a-65f1b3a3266c pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Dec 05 06:27:50 compute-0 nova_compute[186329]: 2025-12-05 06:27:50.397 186333 DEBUG nova.virt.libvirt.driver [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: bbcb12e6-ebf7-49e2-847a-65f1b3a3266c] Creating disk.info with the contents: {'/var/lib/nova/instances/bbcb12e6-ebf7-49e2-847a-65f1b3a3266c/disk': 'qcow2', '/var/lib/nova/instances/bbcb12e6-ebf7-49e2-847a-65f1b3a3266c/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Dec 05 06:27:50 compute-0 nova_compute[186329]: 2025-12-05 06:27:50.397 186333 DEBUG nova.virt.libvirt.driver [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: bbcb12e6-ebf7-49e2-847a-65f1b3a3266c] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Dec 05 06:27:50 compute-0 nova_compute[186329]: 2025-12-05 06:27:50.398 186333 DEBUG nova.objects.instance [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lazy-loading 'trusted_certs' on Instance uuid bbcb12e6-ebf7-49e2-847a-65f1b3a3266c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:27:50 compute-0 nova_compute[186329]: 2025-12-05 06:27:50.902 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:27:50 compute-0 nova_compute[186329]: 2025-12-05 06:27:50.904 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:27:50 compute-0 nova_compute[186329]: 2025-12-05 06:27:50.905 186333 DEBUG oslo_concurrency.processutils [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:27:50 compute-0 nova_compute[186329]: 2025-12-05 06:27:50.947 186333 DEBUG oslo_concurrency.processutils [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:27:50 compute-0 nova_compute[186329]: 2025-12-05 06:27:50.948 186333 DEBUG oslo_concurrency.lockutils [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:27:50 compute-0 nova_compute[186329]: 2025-12-05 06:27:50.948 186333 DEBUG oslo_concurrency.lockutils [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:27:50 compute-0 nova_compute[186329]: 2025-12-05 06:27:50.949 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:27:50 compute-0 nova_compute[186329]: 2025-12-05 06:27:50.951 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:27:50 compute-0 nova_compute[186329]: 2025-12-05 06:27:50.951 186333 DEBUG oslo_concurrency.processutils [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:27:50 compute-0 nova_compute[186329]: 2025-12-05 06:27:50.989 186333 DEBUG oslo_concurrency.processutils [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.038s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:27:50 compute-0 nova_compute[186329]: 2025-12-05 06:27:50.990 186333 DEBUG oslo_concurrency.processutils [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b,backing_fmt=raw /var/lib/nova/instances/bbcb12e6-ebf7-49e2-847a-65f1b3a3266c/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:27:51 compute-0 nova_compute[186329]: 2025-12-05 06:27:51.007 186333 DEBUG oslo_concurrency.processutils [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b,backing_fmt=raw /var/lib/nova/instances/bbcb12e6-ebf7-49e2-847a-65f1b3a3266c/disk 1073741824" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:27:51 compute-0 nova_compute[186329]: 2025-12-05 06:27:51.007 186333 DEBUG oslo_concurrency.lockutils [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.059s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:27:51 compute-0 nova_compute[186329]: 2025-12-05 06:27:51.008 186333 DEBUG oslo_concurrency.processutils [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:27:51 compute-0 nova_compute[186329]: 2025-12-05 06:27:51.048 186333 DEBUG oslo_concurrency.processutils [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:27:51 compute-0 nova_compute[186329]: 2025-12-05 06:27:51.049 186333 DEBUG nova.virt.disk.api [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Checking if we can resize image /var/lib/nova/instances/bbcb12e6-ebf7-49e2-847a-65f1b3a3266c/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 05 06:27:51 compute-0 nova_compute[186329]: 2025-12-05 06:27:51.049 186333 DEBUG oslo_concurrency.processutils [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbcb12e6-ebf7-49e2-847a-65f1b3a3266c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:27:51 compute-0 nova_compute[186329]: 2025-12-05 06:27:51.089 186333 DEBUG oslo_concurrency.processutils [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbcb12e6-ebf7-49e2-847a-65f1b3a3266c/disk --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:27:51 compute-0 nova_compute[186329]: 2025-12-05 06:27:51.090 186333 DEBUG nova.virt.disk.api [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Cannot resize image /var/lib/nova/instances/bbcb12e6-ebf7-49e2-847a-65f1b3a3266c/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 05 06:27:51 compute-0 nova_compute[186329]: 2025-12-05 06:27:51.090 186333 DEBUG nova.objects.instance [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lazy-loading 'migration_context' on Instance uuid bbcb12e6-ebf7-49e2-847a-65f1b3a3266c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:27:51 compute-0 nova_compute[186329]: 2025-12-05 06:27:51.595 186333 DEBUG nova.objects.base [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Object Instance<bbcb12e6-ebf7-49e2-847a-65f1b3a3266c> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Dec 05 06:27:51 compute-0 nova_compute[186329]: 2025-12-05 06:27:51.596 186333 DEBUG oslo_concurrency.processutils [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/bbcb12e6-ebf7-49e2-847a-65f1b3a3266c/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:27:51 compute-0 nova_compute[186329]: 2025-12-05 06:27:51.612 186333 DEBUG oslo_concurrency.processutils [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/bbcb12e6-ebf7-49e2-847a-65f1b3a3266c/disk.config 497664" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:27:51 compute-0 nova_compute[186329]: 2025-12-05 06:27:51.613 186333 DEBUG nova.virt.libvirt.driver [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: bbcb12e6-ebf7-49e2-847a-65f1b3a3266c] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Dec 05 06:27:51 compute-0 nova_compute[186329]: 2025-12-05 06:27:51.614 186333 DEBUG nova.virt.libvirt.vif [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-05T06:26:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-90104807',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-90104807',id=18,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T06:27:10Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d7bd52c6f43e4c0fbd1009b9f0994d4c',ramdisk_id='',reservation_id='r-jbx09typ',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1379393409',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-1379393409-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T06:27:10Z,user_data=None,user_id='f26b0764633e44508f3eff072931d01d',uuid=bbcb12e6-ebf7-49e2-847a-65f1b3a3266c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6814c1d4-d0e4-49a9-92b0-3d640e6a64e2", "address": "fa:16:3e:77:88:29", "network": {"id": "a0d03e43-34ea-4028-a2db-d5149175a508", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1338527276-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a604f2bba734821844adb92d1c578a7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap6814c1d4-d0", "ovs_interfaceid": "6814c1d4-d0e4-49a9-92b0-3d640e6a64e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 05 06:27:51 compute-0 nova_compute[186329]: 2025-12-05 06:27:51.614 186333 DEBUG nova.network.os_vif_util [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Converting VIF {"id": "6814c1d4-d0e4-49a9-92b0-3d640e6a64e2", "address": "fa:16:3e:77:88:29", "network": {"id": "a0d03e43-34ea-4028-a2db-d5149175a508", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1338527276-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a604f2bba734821844adb92d1c578a7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap6814c1d4-d0", "ovs_interfaceid": "6814c1d4-d0e4-49a9-92b0-3d640e6a64e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:27:51 compute-0 nova_compute[186329]: 2025-12-05 06:27:51.614 186333 DEBUG nova.network.os_vif_util [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:77:88:29,bridge_name='br-int',has_traffic_filtering=True,id=6814c1d4-d0e4-49a9-92b0-3d640e6a64e2,network=Network(a0d03e43-34ea-4028-a2db-d5149175a508),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6814c1d4-d0') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:27:51 compute-0 nova_compute[186329]: 2025-12-05 06:27:51.615 186333 DEBUG os_vif [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:88:29,bridge_name='br-int',has_traffic_filtering=True,id=6814c1d4-d0e4-49a9-92b0-3d640e6a64e2,network=Network(a0d03e43-34ea-4028-a2db-d5149175a508),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6814c1d4-d0') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 05 06:27:51 compute-0 nova_compute[186329]: 2025-12-05 06:27:51.616 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:27:51 compute-0 nova_compute[186329]: 2025-12-05 06:27:51.616 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:27:51 compute-0 nova_compute[186329]: 2025-12-05 06:27:51.617 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:27:51 compute-0 nova_compute[186329]: 2025-12-05 06:27:51.617 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:27:51 compute-0 nova_compute[186329]: 2025-12-05 06:27:51.618 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '2e9bff24-73a1-5b22-b0e8-8920225651a3', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:27:51 compute-0 nova_compute[186329]: 2025-12-05 06:27:51.619 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:27:51 compute-0 nova_compute[186329]: 2025-12-05 06:27:51.621 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:27:51 compute-0 nova_compute[186329]: 2025-12-05 06:27:51.623 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:27:51 compute-0 nova_compute[186329]: 2025-12-05 06:27:51.623 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6814c1d4-d0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:27:51 compute-0 nova_compute[186329]: 2025-12-05 06:27:51.624 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap6814c1d4-d0, col_values=(('qos', UUID('8bddd837-e642-45fe-b591-b68868249f5c')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:27:51 compute-0 nova_compute[186329]: 2025-12-05 06:27:51.624 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap6814c1d4-d0, col_values=(('external_ids', {'iface-id': '6814c1d4-d0e4-49a9-92b0-3d640e6a64e2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:77:88:29', 'vm-uuid': 'bbcb12e6-ebf7-49e2-847a-65f1b3a3266c'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:27:51 compute-0 nova_compute[186329]: 2025-12-05 06:27:51.625 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:27:51 compute-0 NetworkManager[55434]: <info>  [1764916071.6260] manager: (tap6814c1d4-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/73)
Dec 05 06:27:51 compute-0 nova_compute[186329]: 2025-12-05 06:27:51.628 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:27:51 compute-0 nova_compute[186329]: 2025-12-05 06:27:51.629 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:27:51 compute-0 nova_compute[186329]: 2025-12-05 06:27:51.630 186333 INFO os_vif [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:77:88:29,bridge_name='br-int',has_traffic_filtering=True,id=6814c1d4-d0e4-49a9-92b0-3d640e6a64e2,network=Network(a0d03e43-34ea-4028-a2db-d5149175a508),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6814c1d4-d0')
Dec 05 06:27:51 compute-0 nova_compute[186329]: 2025-12-05 06:27:51.630 186333 DEBUG nova.virt.libvirt.driver [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Dec 05 06:27:51 compute-0 nova_compute[186329]: 2025-12-05 06:27:51.630 186333 DEBUG nova.compute.manager [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmplun_u1h0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='bbcb12e6-ebf7-49e2-847a-65f1b3a3266c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Dec 05 06:27:51 compute-0 nova_compute[186329]: 2025-12-05 06:27:51.631 186333 WARNING neutronclient.v2_0.client [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:27:51 compute-0 nova_compute[186329]: 2025-12-05 06:27:51.688 186333 WARNING neutronclient.v2_0.client [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:27:52 compute-0 nova_compute[186329]: 2025-12-05 06:27:52.730 186333 DEBUG nova.network.neutron [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: bbcb12e6-ebf7-49e2-847a-65f1b3a3266c] Port 6814c1d4-d0e4-49a9-92b0-3d640e6a64e2 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Dec 05 06:27:52 compute-0 nova_compute[186329]: 2025-12-05 06:27:52.737 186333 DEBUG nova.compute.manager [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmplun_u1h0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='bbcb12e6-ebf7-49e2-847a-65f1b3a3266c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Dec 05 06:27:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:27:53.732 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '46:84:d1', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'a6:99:88:db:d9:a2'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:27:53 compute-0 nova_compute[186329]: 2025-12-05 06:27:53.732 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:27:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:27:53.733 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 05 06:27:53 compute-0 nova_compute[186329]: 2025-12-05 06:27:53.905 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:27:54 compute-0 nova_compute[186329]: 2025-12-05 06:27:54.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:27:55 compute-0 nova_compute[186329]: 2025-12-05 06:27:55.221 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:27:55 compute-0 nova_compute[186329]: 2025-12-05 06:27:55.221 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:27:55 compute-0 nova_compute[186329]: 2025-12-05 06:27:55.222 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:27:55 compute-0 nova_compute[186329]: 2025-12-05 06:27:55.222 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 05 06:27:55 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec 05 06:27:55 compute-0 systemd[1]: Starting libvirt proxy daemon...
Dec 05 06:27:55 compute-0 systemd[1]: Started libvirt proxy daemon.
Dec 05 06:27:55 compute-0 NetworkManager[55434]: <info>  [1764916075.5715] manager: (tap6814c1d4-d0): new Tun device (/org/freedesktop/NetworkManager/Devices/74)
Dec 05 06:27:55 compute-0 kernel: tap6814c1d4-d0: entered promiscuous mode
Dec 05 06:27:55 compute-0 nova_compute[186329]: 2025-12-05 06:27:55.573 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:27:55 compute-0 ovn_controller[95223]: 2025-12-05T06:27:55Z|00166|binding|INFO|Claiming lport 6814c1d4-d0e4-49a9-92b0-3d640e6a64e2 for this additional chassis.
Dec 05 06:27:55 compute-0 ovn_controller[95223]: 2025-12-05T06:27:55Z|00167|binding|INFO|6814c1d4-d0e4-49a9-92b0-3d640e6a64e2: Claiming fa:16:3e:77:88:29 10.100.0.10
Dec 05 06:27:55 compute-0 nova_compute[186329]: 2025-12-05 06:27:55.578 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:27:55.591 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:77:88:29 10.100.0.10'], port_security=['fa:16:3e:77:88:29 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'bbcb12e6-ebf7-49e2-847a-65f1b3a3266c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a0d03e43-34ea-4028-a2db-d5149175a508', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd7bd52c6f43e4c0fbd1009b9f0994d4c', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'f6ae1da8-70b1-4c7e-8cc3-a298827abbb1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ea3a7080-e915-4f20-bdbc-22c4af619c01, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=6814c1d4-d0e4-49a9-92b0-3d640e6a64e2) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:27:55.592 104041 INFO neutron.agent.ovn.metadata.agent [-] Port 6814c1d4-d0e4-49a9-92b0-3d640e6a64e2 in datapath a0d03e43-34ea-4028-a2db-d5149175a508 unbound from our chassis
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:27:55.593 104041 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a0d03e43-34ea-4028-a2db-d5149175a508
Dec 05 06:27:55 compute-0 systemd-machined[152967]: New machine qemu-15-instance-00000012.
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:27:55.602 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[beeec36a-4ced-4f78-b1c5-720d9bd93e2d]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:27:55.602 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa0d03e43-31 in ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:27:55.606 206383 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa0d03e43-30 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:27:55.606 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[20e49c0e-a841-4d8e-b8c7-9d6f2a33a71b]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:27:55.606 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[301a53a6-8818-43b4-9eb9-055514fa32ee]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:27:55.615 104153 DEBUG oslo.privsep.daemon [-] privsep: reply[5f218a59-4b60-4580-b935-0ecab8c19f53]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:27:55.631 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[fa3104a2-cba0-4196-853d-d5aeca873807]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:27:55 compute-0 systemd[1]: Started Virtual Machine qemu-15-instance-00000012.
Dec 05 06:27:55 compute-0 ovn_controller[95223]: 2025-12-05T06:27:55Z|00168|binding|INFO|Setting lport 6814c1d4-d0e4-49a9-92b0-3d640e6a64e2 ovn-installed in OVS
Dec 05 06:27:55 compute-0 nova_compute[186329]: 2025-12-05 06:27:55.639 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:27:55 compute-0 systemd-udevd[213084]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 06:27:55 compute-0 NetworkManager[55434]: <info>  [1764916075.6555] device (tap6814c1d4-d0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:27:55.655 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[20be5e76-26b2-494b-9d61-1647e23a6878]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:27:55 compute-0 NetworkManager[55434]: <info>  [1764916075.6563] device (tap6814c1d4-d0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:27:55.658 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[4b25d07d-1c6e-4391-8d01-64b6cfacd5f9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:27:55 compute-0 NetworkManager[55434]: <info>  [1764916075.6595] manager: (tapa0d03e43-30): new Veth device (/org/freedesktop/NetworkManager/Devices/75)
Dec 05 06:27:55 compute-0 systemd-udevd[213087]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:27:55.681 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[2f7859d9-8a1c-48d8-bb0e-dbd2edca2048]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:27:55.683 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[3355ef29-0f00-4e6b-82a5-14ce8f72e573]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:27:55 compute-0 NetworkManager[55434]: <info>  [1764916075.7001] device (tapa0d03e43-30): carrier: link connected
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:27:55.704 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[4f4ce479-1df6-4bf9-9e72-98e8ed45e44b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:27:55.716 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[f2eacd1f-a62d-4630-afac-099221d960a9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa0d03e43-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:47:f5:eb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 395423, 'reachable_time': 19118, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213105, 'error': None, 'target': 'ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:27:55.728 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[808bdeef-3de6-44ed-bd31-2dd25f6bd3ae]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe47:f5eb'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 395423, 'tstamp': 395423}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213106, 'error': None, 'target': 'ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:27:55.740 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[d982a9a2-955b-4959-a91c-7830408ce38b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa0d03e43-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:47:f5:eb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 395423, 'reachable_time': 19118, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213107, 'error': None, 'target': 'ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:27:55.760 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[2628ea74-b936-400b-a5af-6dc6e7c36137]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:27:55.801 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[c2d4ffe3-9f2b-4696-8df4-6f31ea3b4b8c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:27:55.802 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa0d03e43-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:27:55.802 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:27:55.802 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa0d03e43-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:27:55 compute-0 NetworkManager[55434]: <info>  [1764916075.8042] manager: (tapa0d03e43-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/76)
Dec 05 06:27:55 compute-0 kernel: tapa0d03e43-30: entered promiscuous mode
Dec 05 06:27:55 compute-0 nova_compute[186329]: 2025-12-05 06:27:55.803 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:27:55 compute-0 nova_compute[186329]: 2025-12-05 06:27:55.806 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:27:55.808 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa0d03e43-30, col_values=(('external_ids', {'iface-id': '88888304-ddfb-428f-a56e-30bd63031520'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:27:55 compute-0 ovn_controller[95223]: 2025-12-05T06:27:55Z|00169|binding|INFO|Releasing lport 88888304-ddfb-428f-a56e-30bd63031520 from this chassis (sb_readonly=0)
Dec 05 06:27:55 compute-0 nova_compute[186329]: 2025-12-05 06:27:55.809 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:27:55 compute-0 nova_compute[186329]: 2025-12-05 06:27:55.809 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:27:55 compute-0 nova_compute[186329]: 2025-12-05 06:27:55.821 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:27:55.821 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[4f75531d-675b-4252-be60-237c32ff16eb]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:27:55.822 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a0d03e43-34ea-4028-a2db-d5149175a508.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a0d03e43-34ea-4028-a2db-d5149175a508.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:27:55.822 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a0d03e43-34ea-4028-a2db-d5149175a508.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a0d03e43-34ea-4028-a2db-d5149175a508.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:27:55.822 104041 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for a0d03e43-34ea-4028-a2db-d5149175a508 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:27:55.822 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a0d03e43-34ea-4028-a2db-d5149175a508.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a0d03e43-34ea-4028-a2db-d5149175a508.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:27:55.822 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[d980b836-3922-4505-8bf0-4aa30c70437e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:27:55.823 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a0d03e43-34ea-4028-a2db-d5149175a508.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a0d03e43-34ea-4028-a2db-d5149175a508.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:27:55.823 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[ddca5428-eb77-4cb0-8552-4e2ce9cc4088]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:27:55.823 104041 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]: global
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]:     log         /dev/log local0 debug
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]:     log-tag     haproxy-metadata-proxy-a0d03e43-34ea-4028-a2db-d5149175a508
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]:     user        root
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]:     group       root
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]:     maxconn     1024
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]:     pidfile     /var/lib/neutron/external/pids/a0d03e43-34ea-4028-a2db-d5149175a508.pid.haproxy
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]:     daemon
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]: defaults
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]:     log global
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]:     mode http
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]:     option httplog
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]:     option dontlognull
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]:     option http-server-close
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]:     option forwardfor
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]:     retries                 3
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]:     timeout http-request    30s
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]:     timeout connect         30s
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]:     timeout client          32s
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]:     timeout server          32s
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]:     timeout http-keep-alive 30s
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]: listen listener
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]:     bind 169.254.169.254:80
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]:     
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]:     http-request add-header X-OVN-Network-ID a0d03e43-34ea-4028-a2db-d5149175a508
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 05 06:27:55 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:27:55.824 104041 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508', 'env', 'PROCESS_TAG=haproxy-a0d03e43-34ea-4028-a2db-d5149175a508', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a0d03e43-34ea-4028-a2db-d5149175a508.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 05 06:27:56 compute-0 podman[213143]: 2025-12-05 06:27:56.136050791 +0000 UTC m=+0.033602119 container create a906f7ef882318ae1c3d35150e8e4b9119113d90f6793de78379810cd4748565 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 05 06:27:56 compute-0 systemd[1]: Started libpod-conmon-a906f7ef882318ae1c3d35150e8e4b9119113d90f6793de78379810cd4748565.scope.
Dec 05 06:27:56 compute-0 systemd[1]: Started libcrun container.
Dec 05 06:27:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f47795765a033e8fd49abafcdd4d5321f5212ca762abfb2175d8b4927130099e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 06:27:56 compute-0 podman[213143]: 2025-12-05 06:27:56.192616917 +0000 UTC m=+0.090168265 container init a906f7ef882318ae1c3d35150e8e4b9119113d90f6793de78379810cd4748565 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec)
Dec 05 06:27:56 compute-0 podman[213143]: 2025-12-05 06:27:56.198155271 +0000 UTC m=+0.095706599 container start a906f7ef882318ae1c3d35150e8e4b9119113d90f6793de78379810cd4748565 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 05 06:27:56 compute-0 podman[213143]: 2025-12-05 06:27:56.120812951 +0000 UTC m=+0.018364299 image pull 773fbabd60bc1c8e2ad203301a187a593937fa42bcc751aba0971bd96baac0cb quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current
Dec 05 06:27:56 compute-0 neutron-haproxy-ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508[213155]: [NOTICE]   (213159) : New worker (213161) forked
Dec 05 06:27:56 compute-0 neutron-haproxy-ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508[213155]: [NOTICE]   (213159) : Loading success.
Dec 05 06:27:56 compute-0 nova_compute[186329]: 2025-12-05 06:27:56.244 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:27:56 compute-0 nova_compute[186329]: 2025-12-05 06:27:56.298 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:27:56 compute-0 nova_compute[186329]: 2025-12-05 06:27:56.299 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:27:56 compute-0 nova_compute[186329]: 2025-12-05 06:27:56.352 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:27:56 compute-0 nova_compute[186329]: 2025-12-05 06:27:56.552 186333 WARNING nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:27:56 compute-0 nova_compute[186329]: 2025-12-05 06:27:56.553 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:27:56 compute-0 nova_compute[186329]: 2025-12-05 06:27:56.566 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.012s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:27:56 compute-0 nova_compute[186329]: 2025-12-05 06:27:56.566 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5727MB free_disk=73.13736343383789GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": 
"0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 05 06:27:56 compute-0 nova_compute[186329]: 2025-12-05 06:27:56.567 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:27:56 compute-0 nova_compute[186329]: 2025-12-05 06:27:56.567 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:27:56 compute-0 nova_compute[186329]: 2025-12-05 06:27:56.626 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:27:57 compute-0 nova_compute[186329]: 2025-12-05 06:27:57.585 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Migration for instance bbcb12e6-ebf7-49e2-847a-65f1b3a3266c refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Dec 05 06:27:58 compute-0 nova_compute[186329]: 2025-12-05 06:27:58.089 186333 INFO nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] [instance: bbcb12e6-ebf7-49e2-847a-65f1b3a3266c] Updating resource usage from migration 0a0f0cf0-56b1-4d18-a6c6-58a5fb7e7ed2
Dec 05 06:27:58 compute-0 nova_compute[186329]: 2025-12-05 06:27:58.090 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] [instance: bbcb12e6-ebf7-49e2-847a-65f1b3a3266c] Starting to track incoming migration 0a0f0cf0-56b1-4d18-a6c6-58a5fb7e7ed2 with flavor cb13e320-971c-46c2-a935-d695f3631bf8 _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Dec 05 06:27:58 compute-0 ovn_controller[95223]: 2025-12-05T06:27:58Z|00170|binding|INFO|Claiming lport 6814c1d4-d0e4-49a9-92b0-3d640e6a64e2 for this chassis.
Dec 05 06:27:58 compute-0 ovn_controller[95223]: 2025-12-05T06:27:58Z|00171|binding|INFO|6814c1d4-d0e4-49a9-92b0-3d640e6a64e2: Claiming fa:16:3e:77:88:29 10.100.0.10
Dec 05 06:27:58 compute-0 ovn_controller[95223]: 2025-12-05T06:27:58Z|00172|binding|INFO|Setting lport 6814c1d4-d0e4-49a9-92b0-3d640e6a64e2 up in Southbound
Dec 05 06:27:58 compute-0 nova_compute[186329]: 2025-12-05 06:27:58.906 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:27:59 compute-0 nova_compute[186329]: 2025-12-05 06:27:59.118 186333 INFO nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance 83dcc9e9-9fcf-4e34-8b76-598c38ed9bc0 has allocations against this compute host but is not found in the database.
Dec 05 06:27:59 compute-0 nova_compute[186329]: 2025-12-05 06:27:59.623 186333 WARNING nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance bbcb12e6-ebf7-49e2-847a-65f1b3a3266c has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Dec 05 06:27:59 compute-0 nova_compute[186329]: 2025-12-05 06:27:59.623 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 05 06:27:59 compute-0 nova_compute[186329]: 2025-12-05 06:27:59.624 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 06:27:56 up  1:05,  0 user,  load average: 0.05, 0.15, 0.24\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 05 06:27:59 compute-0 nova_compute[186329]: 2025-12-05 06:27:59.637 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Refreshing inventories for resource provider f2df025e-56e9-4920-9fad-1a12202c4aeb _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Dec 05 06:27:59 compute-0 nova_compute[186329]: 2025-12-05 06:27:59.647 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Updating ProviderTree inventory for provider f2df025e-56e9-4920-9fad-1a12202c4aeb from _refresh_and_get_inventory using data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Dec 05 06:27:59 compute-0 nova_compute[186329]: 2025-12-05 06:27:59.647 186333 DEBUG nova.compute.provider_tree [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Updating inventory in ProviderTree for provider f2df025e-56e9-4920-9fad-1a12202c4aeb with inventory: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 05 06:27:59 compute-0 nova_compute[186329]: 2025-12-05 06:27:59.656 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Refreshing aggregate associations for resource provider f2df025e-56e9-4920-9fad-1a12202c4aeb, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Dec 05 06:27:59 compute-0 nova_compute[186329]: 2025-12-05 06:27:59.668 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Refreshing trait associations for resource provider f2df025e-56e9-4920-9fad-1a12202c4aeb, traits: COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_EXTEND,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SOUND_MODEL_SB16,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE42,COMPUTE_NET_VIRTIO_PACKED,HW_CPU_X86_SSE2,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_TPM_CRB,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_TPM_TIS,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SOUND_MODEL_ES1370,HW_ARCH_X86_64,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SOUND_MODEL_USB,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_TRUSTED_CERTS,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_ARCH_X86_64,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_INTEL _refresh_associations 
/usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Dec 05 06:27:59 compute-0 nova_compute[186329]: 2025-12-05 06:27:59.700 186333 DEBUG nova.compute.provider_tree [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:27:59 compute-0 podman[196599]: time="2025-12-05T06:27:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:27:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:27:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18590 "" "Go-http-client/1.1"
Dec 05 06:27:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:27:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3049 "" "Go-http-client/1.1"
Dec 05 06:28:00 compute-0 nova_compute[186329]: 2025-12-05 06:28:00.204 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:28:00 compute-0 nova_compute[186329]: 2025-12-05 06:28:00.612 186333 INFO nova.compute.manager [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: bbcb12e6-ebf7-49e2-847a-65f1b3a3266c] Post operation of migration started
Dec 05 06:28:00 compute-0 nova_compute[186329]: 2025-12-05 06:28:00.612 186333 WARNING neutronclient.v2_0.client [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:28:00 compute-0 nova_compute[186329]: 2025-12-05 06:28:00.678 186333 WARNING neutronclient.v2_0.client [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:28:00 compute-0 nova_compute[186329]: 2025-12-05 06:28:00.679 186333 WARNING neutronclient.v2_0.client [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:28:00 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:28:00.688 104041 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 181a7601-05df-4e53-8c20-e59562925885 with type ""
Dec 05 06:28:00 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:28:00.689 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:77:88:29 10.100.0.10'], port_security=['fa:16:3e:77:88:29 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'bbcb12e6-ebf7-49e2-847a-65f1b3a3266c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a0d03e43-34ea-4028-a2db-d5149175a508', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd7bd52c6f43e4c0fbd1009b9f0994d4c', 'neutron:revision_number': '15', 'neutron:security_group_ids': 'f6ae1da8-70b1-4c7e-8cc3-a298827abbb1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ea3a7080-e915-4f20-bdbc-22c4af619c01, chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], logical_port=6814c1d4-d0e4-49a9-92b0-3d640e6a64e2) old= matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:28:00 compute-0 ovn_controller[95223]: 2025-12-05T06:28:00Z|00173|binding|INFO|Removing iface tap6814c1d4-d0 ovn-installed in OVS
Dec 05 06:28:00 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:28:00.689 104041 INFO neutron.agent.ovn.metadata.agent [-] Port 6814c1d4-d0e4-49a9-92b0-3d640e6a64e2 in datapath a0d03e43-34ea-4028-a2db-d5149175a508 unbound from our chassis
Dec 05 06:28:00 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:28:00.690 104041 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a0d03e43-34ea-4028-a2db-d5149175a508, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 05 06:28:00 compute-0 ovn_controller[95223]: 2025-12-05T06:28:00Z|00174|binding|INFO|Removing lport 6814c1d4-d0e4-49a9-92b0-3d640e6a64e2 ovn-installed in OVS
Dec 05 06:28:00 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:28:00.690 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[4482353d-2d6f-4835-a25a-2920487790c1]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:28:00 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:28:00.691 104041 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508 namespace which is not needed anymore
Dec 05 06:28:00 compute-0 nova_compute[186329]: 2025-12-05 06:28:00.691 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:28:00 compute-0 nova_compute[186329]: 2025-12-05 06:28:00.703 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:28:00 compute-0 nova_compute[186329]: 2025-12-05 06:28:00.709 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 05 06:28:00 compute-0 nova_compute[186329]: 2025-12-05 06:28:00.709 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.142s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:28:00 compute-0 nova_compute[186329]: 2025-12-05 06:28:00.746 186333 DEBUG oslo_concurrency.lockutils [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "refresh_cache-bbcb12e6-ebf7-49e2-847a-65f1b3a3266c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:28:00 compute-0 nova_compute[186329]: 2025-12-05 06:28:00.746 186333 DEBUG oslo_concurrency.lockutils [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquired lock "refresh_cache-bbcb12e6-ebf7-49e2-847a-65f1b3a3266c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:28:00 compute-0 nova_compute[186329]: 2025-12-05 06:28:00.747 186333 DEBUG nova.network.neutron [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: bbcb12e6-ebf7-49e2-847a-65f1b3a3266c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 05 06:28:00 compute-0 neutron-haproxy-ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508[213155]: [NOTICE]   (213159) : haproxy version is 3.0.5-8e879a5
Dec 05 06:28:00 compute-0 neutron-haproxy-ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508[213155]: [NOTICE]   (213159) : path to executable is /usr/sbin/haproxy
Dec 05 06:28:00 compute-0 neutron-haproxy-ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508[213155]: [WARNING]  (213159) : Exiting Master process...
Dec 05 06:28:00 compute-0 podman[213196]: 2025-12-05 06:28:00.775741022 +0000 UTC m=+0.020177189 container kill a906f7ef882318ae1c3d35150e8e4b9119113d90f6793de78379810cd4748565 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4)
Dec 05 06:28:00 compute-0 neutron-haproxy-ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508[213155]: [ALERT]    (213159) : Current worker (213161) exited with code 143 (Terminated)
Dec 05 06:28:00 compute-0 neutron-haproxy-ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508[213155]: [WARNING]  (213159) : All workers exited. Exiting... (0)
Dec 05 06:28:00 compute-0 systemd[1]: libpod-a906f7ef882318ae1c3d35150e8e4b9119113d90f6793de78379810cd4748565.scope: Deactivated successfully.
Dec 05 06:28:00 compute-0 podman[213207]: 2025-12-05 06:28:00.812441086 +0000 UTC m=+0.021370323 container died a906f7ef882318ae1c3d35150e8e4b9119113d90f6793de78379810cd4748565 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 05 06:28:00 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a906f7ef882318ae1c3d35150e8e4b9119113d90f6793de78379810cd4748565-userdata-shm.mount: Deactivated successfully.
Dec 05 06:28:00 compute-0 systemd[1]: var-lib-containers-storage-overlay-f47795765a033e8fd49abafcdd4d5321f5212ca762abfb2175d8b4927130099e-merged.mount: Deactivated successfully.
Dec 05 06:28:00 compute-0 podman[213207]: 2025-12-05 06:28:00.835110148 +0000 UTC m=+0.044039385 container cleanup a906f7ef882318ae1c3d35150e8e4b9119113d90f6793de78379810cd4748565 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 05 06:28:00 compute-0 systemd[1]: libpod-conmon-a906f7ef882318ae1c3d35150e8e4b9119113d90f6793de78379810cd4748565.scope: Deactivated successfully.
Dec 05 06:28:00 compute-0 podman[213209]: 2025-12-05 06:28:00.844195531 +0000 UTC m=+0.048974072 container remove a906f7ef882318ae1c3d35150e8e4b9119113d90f6793de78379810cd4748565 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 05 06:28:00 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:28:00.855 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[0a2f0181-92b0-42a8-bada-ffd407a30675]: (4, ("Fri Dec  5 06:28:00 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508 (a906f7ef882318ae1c3d35150e8e4b9119113d90f6793de78379810cd4748565)\na906f7ef882318ae1c3d35150e8e4b9119113d90f6793de78379810cd4748565\nFri Dec  5 06:28:00 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508 (a906f7ef882318ae1c3d35150e8e4b9119113d90f6793de78379810cd4748565)\na906f7ef882318ae1c3d35150e8e4b9119113d90f6793de78379810cd4748565\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:28:00 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:28:00.855 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[7871b4e7-a332-41d7-8d03-af91da463f45]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:28:00 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:28:00.856 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a0d03e43-34ea-4028-a2db-d5149175a508.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a0d03e43-34ea-4028-a2db-d5149175a508.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:28:00 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:28:00.856 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[f03dbf4f-e464-44b7-b45d-7ef1703bfb22]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:28:00 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:28:00.856 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa0d03e43-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:28:00 compute-0 nova_compute[186329]: 2025-12-05 06:28:00.858 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:28:00 compute-0 kernel: tapa0d03e43-30: left promiscuous mode
Dec 05 06:28:00 compute-0 nova_compute[186329]: 2025-12-05 06:28:00.870 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:28:00 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:28:00.873 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[b446b2d3-1c70-4a2b-9ff1-4e3f2e5b46c5]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:28:00 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:28:00.883 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[4f399be0-55d4-4f56-8175-cbf4fb8aedc0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:28:00 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:28:00.883 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[c29856c6-ab4d-4195-a2b8-52eceb163c7f]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:28:00 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:28:00.897 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[ef30ea67-d967-45c0-9ac7-4a1b92e0886a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 395418, 'reachable_time': 32406, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213231, 'error': None, 'target': 'ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:28:00 compute-0 systemd[1]: run-netns-ovnmeta\x2da0d03e43\x2d34ea\x2d4028\x2da2db\x2dd5149175a508.mount: Deactivated successfully.
Dec 05 06:28:00 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:28:00.900 104153 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 05 06:28:00 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:28:00.900 104153 DEBUG oslo.privsep.daemon [-] privsep: reply[d65ad550-358f-471f-9a06-99d3af19889b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:28:01 compute-0 nova_compute[186329]: 2025-12-05 06:28:01.251 186333 WARNING neutronclient.v2_0.client [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:28:01 compute-0 openstack_network_exporter[198686]: ERROR   06:28:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:28:01 compute-0 openstack_network_exporter[198686]: ERROR   06:28:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:28:01 compute-0 openstack_network_exporter[198686]: ERROR   06:28:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:28:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:28:01 compute-0 openstack_network_exporter[198686]: ERROR   06:28:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:28:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:28:01 compute-0 openstack_network_exporter[198686]: ERROR   06:28:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:28:01 compute-0 nova_compute[186329]: 2025-12-05 06:28:01.623 186333 INFO nova.network.neutron [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: bbcb12e6-ebf7-49e2-847a-65f1b3a3266c] Port 6814c1d4-d0e4-49a9-92b0-3d640e6a64e2 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Dec 05 06:28:01 compute-0 nova_compute[186329]: 2025-12-05 06:28:01.624 186333 DEBUG nova.network.neutron [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: bbcb12e6-ebf7-49e2-847a-65f1b3a3266c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:28:01 compute-0 nova_compute[186329]: 2025-12-05 06:28:01.627 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:28:01 compute-0 nova_compute[186329]: 2025-12-05 06:28:01.708 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:28:01 compute-0 nova_compute[186329]: 2025-12-05 06:28:01.708 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:28:01 compute-0 nova_compute[186329]: 2025-12-05 06:28:01.708 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:28:01 compute-0 nova_compute[186329]: 2025-12-05 06:28:01.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:28:01 compute-0 nova_compute[186329]: 2025-12-05 06:28:01.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:28:01 compute-0 nova_compute[186329]: 2025-12-05 06:28:01.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:28:01 compute-0 nova_compute[186329]: 2025-12-05 06:28:01.709 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 05 06:28:02 compute-0 nova_compute[186329]: 2025-12-05 06:28:02.128 186333 DEBUG oslo_concurrency.lockutils [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Releasing lock "refresh_cache-bbcb12e6-ebf7-49e2-847a-65f1b3a3266c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:28:02 compute-0 nova_compute[186329]: 2025-12-05 06:28:02.638 186333 DEBUG oslo_concurrency.lockutils [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:28:02 compute-0 nova_compute[186329]: 2025-12-05 06:28:02.639 186333 DEBUG oslo_concurrency.lockutils [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:28:02 compute-0 nova_compute[186329]: 2025-12-05 06:28:02.639 186333 DEBUG oslo_concurrency.lockutils [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:28:02 compute-0 nova_compute[186329]: 2025-12-05 06:28:02.641 186333 INFO nova.virt.libvirt.driver [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: bbcb12e6-ebf7-49e2-847a-65f1b3a3266c] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Dec 05 06:28:02 compute-0 virtqemud[186605]: Domain id=15 name='instance-00000012' uuid=bbcb12e6-ebf7-49e2-847a-65f1b3a3266c is tainted: custom-monitor
Dec 05 06:28:02 compute-0 nova_compute[186329]: 2025-12-05 06:28:02.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:28:02 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:28:02.734 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=89d40815-76f5-4f1d-9077-84d831b7d6c4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:28:03 compute-0 podman[213237]: 2025-12-05 06:28:03.461856063 +0000 UTC m=+0.037366668 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 05 06:28:03 compute-0 podman[213236]: 2025-12-05 06:28:03.491445356 +0000 UTC m=+0.067136672 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4)
Dec 05 06:28:03 compute-0 nova_compute[186329]: 2025-12-05 06:28:03.647 186333 INFO nova.virt.libvirt.driver [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: bbcb12e6-ebf7-49e2-847a-65f1b3a3266c] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Dec 05 06:28:03 compute-0 nova_compute[186329]: 2025-12-05 06:28:03.907 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:28:04 compute-0 nova_compute[186329]: 2025-12-05 06:28:04.651 186333 INFO nova.virt.libvirt.driver [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: bbcb12e6-ebf7-49e2-847a-65f1b3a3266c] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Dec 05 06:28:04 compute-0 nova_compute[186329]: 2025-12-05 06:28:04.654 186333 DEBUG nova.compute.manager [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: bbcb12e6-ebf7-49e2-847a-65f1b3a3266c] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 05 06:28:05 compute-0 nova_compute[186329]: 2025-12-05 06:28:05.160 186333 DEBUG nova.objects.instance [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: bbcb12e6-ebf7-49e2-847a-65f1b3a3266c] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.629 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server [None req-785cfdf8-239e-4e33-973e-3cf70c84496e e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Exception during message handling: nova.exception_Remote.InstanceNotFound_Remote: Instance bbcb12e6-ebf7-49e2-847a-65f1b3a3266c could not be found.
Dec 05 06:28:06 compute-0 nova_compute[186329]: Traceback (most recent call last):
Dec 05 06:28:06 compute-0 nova_compute[186329]: 
Dec 05 06:28:06 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/conductor/manager.py", line 143, in _object_dispatch
Dec 05 06:28:06 compute-0 nova_compute[186329]:     return getattr(target, method)(*args, **kwargs)
Dec 05 06:28:06 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:28:06 compute-0 nova_compute[186329]: 
Dec 05 06:28:06 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_versionedobjects/base.py", line 226, in wrapper
Dec 05 06:28:06 compute-0 nova_compute[186329]:     return fn(self, *args, **kwargs)
Dec 05 06:28:06 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:28:06 compute-0 nova_compute[186329]: 
Dec 05 06:28:06 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/objects/instance.py", line 878, in save
Dec 05 06:28:06 compute-0 nova_compute[186329]:     old_ref, inst_ref = db.instance_update_and_get_original(
Dec 05 06:28:06 compute-0 nova_compute[186329]:                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:28:06 compute-0 nova_compute[186329]: 
Dec 05 06:28:06 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/db/utils.py", line 35, in wrapper
Dec 05 06:28:06 compute-0 nova_compute[186329]:     return f(*args, **kwargs)
Dec 05 06:28:06 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^
Dec 05 06:28:06 compute-0 nova_compute[186329]: 
Dec 05 06:28:06 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 144, in wrapper
Dec 05 06:28:06 compute-0 nova_compute[186329]:     with excutils.save_and_reraise_exception() as ectxt:
Dec 05 06:28:06 compute-0 nova_compute[186329]:          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:28:06 compute-0 nova_compute[186329]: 
Dec 05 06:28:06 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 05 06:28:06 compute-0 nova_compute[186329]:     self.force_reraise()
Dec 05 06:28:06 compute-0 nova_compute[186329]: 
Dec 05 06:28:06 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 05 06:28:06 compute-0 nova_compute[186329]:     raise self.value
Dec 05 06:28:06 compute-0 nova_compute[186329]: 
Dec 05 06:28:06 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 142, in wrapper
Dec 05 06:28:06 compute-0 nova_compute[186329]:     return f(*args, **kwargs)
Dec 05 06:28:06 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^
Dec 05 06:28:06 compute-0 nova_compute[186329]: 
Dec 05 06:28:06 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 207, in wrapper
Dec 05 06:28:06 compute-0 nova_compute[186329]:     return f(context, *args, **kwargs)
Dec 05 06:28:06 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:28:06 compute-0 nova_compute[186329]: 
Dec 05 06:28:06 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 2301, in instance_update_and_get_original
Dec 05 06:28:06 compute-0 nova_compute[186329]:     instance_ref = _instance_get_by_uuid(context, instance_uuid,
Dec 05 06:28:06 compute-0 nova_compute[186329]:                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:28:06 compute-0 nova_compute[186329]: 
Dec 05 06:28:06 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 1411, in _instance_get_by_uuid
Dec 05 06:28:06 compute-0 nova_compute[186329]:     raise exception.InstanceNotFound(instance_id=uuid)
Dec 05 06:28:06 compute-0 nova_compute[186329]: 
Dec 05 06:28:06 compute-0 nova_compute[186329]: nova.exception.InstanceNotFound: Instance bbcb12e6-ebf7-49e2-847a-65f1b3a3266c could not be found.
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/server.py", line 174, in _process_incoming
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server              ^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/exception_wrapper.py", line 65, in wrapped
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception():
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server     self.force_reraise()
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server     raise self.value
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/exception_wrapper.py", line 63, in wrapped
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server     return f(self, context, *args, **kw)
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/compute/utils.py", line 1483, in decorated_function
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 203, in decorated_function
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 10237, in post_live_migration_at_destination
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server     instance.save(expected_task_state=task_states.MIGRATING)
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_versionedobjects/base.py", line 209, in wrapper
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server     updates, result = self.indirection_api.object_action(
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/conductor/rpcapi.py", line 247, in object_action
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server     return cctxt.call(context, 'object_action', objinst=objinst,
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/client.py", line 180, in call
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server     result = self.transport._send(
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server              ^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_messaging/transport.py", line 123, in _send
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server     return self._driver.send(target, ctxt, message,
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 794, in send
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server     return self._send(target, ctxt, message, wait_for_reply, timeout,
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 786, in _send
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server     raise result
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server nova.exception_Remote.InstanceNotFound_Remote: Instance bbcb12e6-ebf7-49e2-847a-65f1b3a3266c could not be found.
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/conductor/manager.py", line 143, in _object_dispatch
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server     return getattr(target, method)(*args, **kwargs)
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_versionedobjects/base.py", line 226, in wrapper
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server     return fn(self, *args, **kwargs)
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/objects/instance.py", line 878, in save
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server     old_ref, inst_ref = db.instance_update_and_get_original(
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/db/utils.py", line 35, in wrapper
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server     return f(*args, **kwargs)
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 144, in wrapper
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception() as ectxt:
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server     self.force_reraise()
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server     raise self.value
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 142, in wrapper
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server     return f(*args, **kwargs)
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 207, in wrapper
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server     return f(context, *args, **kwargs)
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 2301, in instance_update_and_get_original
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server     instance_ref = _instance_get_by_uuid(context, instance_uuid,
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 1411, in _instance_get_by_uuid
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server     raise exception.InstanceNotFound(instance_id=uuid)
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server nova.exception.InstanceNotFound: Instance bbcb12e6-ebf7-49e2-847a-65f1b3a3266c could not be found.
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:28:06 compute-0 nova_compute[186329]: 2025-12-05 06:28:06.691 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:28:08 compute-0 nova_compute[186329]: 2025-12-05 06:28:08.908 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:28:11 compute-0 nova_compute[186329]: 2025-12-05 06:28:11.631 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:28:13 compute-0 podman[213281]: 2025-12-05 06:28:13.460146697 +0000 UTC m=+0.045683696 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent)
Dec 05 06:28:13 compute-0 podman[213283]: 2025-12-05 06:28:13.466442364 +0000 UTC m=+0.048353075 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 05 06:28:13 compute-0 podman[213282]: 2025-12-05 06:28:13.466883554 +0000 UTC m=+0.050431693 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, vcs-type=git, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, managed_by=edpm_ansible, architecture=x86_64, 
distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7)
Dec 05 06:28:13 compute-0 nova_compute[186329]: 2025-12-05 06:28:13.909 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:28:16 compute-0 nova_compute[186329]: 2025-12-05 06:28:16.633 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:28:18 compute-0 nova_compute[186329]: 2025-12-05 06:28:18.910 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:28:21 compute-0 nova_compute[186329]: 2025-12-05 06:28:21.635 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:28:23 compute-0 nova_compute[186329]: 2025-12-05 06:28:23.912 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:28:26 compute-0 nova_compute[186329]: 2025-12-05 06:28:26.636 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:28:28 compute-0 nova_compute[186329]: 2025-12-05 06:28:28.914 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:28:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:28:29.514 104041 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:28:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:28:29.514 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:28:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:28:29.514 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:28:29 compute-0 podman[196599]: time="2025-12-05T06:28:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:28:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:28:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:28:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:28:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2592 "" "Go-http-client/1.1"
Dec 05 06:28:31 compute-0 openstack_network_exporter[198686]: ERROR   06:28:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:28:31 compute-0 openstack_network_exporter[198686]: ERROR   06:28:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:28:31 compute-0 openstack_network_exporter[198686]: ERROR   06:28:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:28:31 compute-0 openstack_network_exporter[198686]: ERROR   06:28:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:28:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:28:31 compute-0 openstack_network_exporter[198686]: ERROR   06:28:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:28:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:28:31 compute-0 nova_compute[186329]: 2025-12-05 06:28:31.638 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:28:33 compute-0 nova_compute[186329]: 2025-12-05 06:28:33.915 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:28:34 compute-0 podman[213341]: 2025-12-05 06:28:34.459694843 +0000 UTC m=+0.038294143 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 06:28:34 compute-0 podman[213340]: 2025-12-05 06:28:34.479339338 +0000 UTC m=+0.059740686 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4)
Dec 05 06:28:36 compute-0 nova_compute[186329]: 2025-12-05 06:28:36.640 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:28:38 compute-0 nova_compute[186329]: 2025-12-05 06:28:38.916 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:28:41 compute-0 nova_compute[186329]: 2025-12-05 06:28:41.642 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:28:43 compute-0 nova_compute[186329]: 2025-12-05 06:28:43.917 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:28:44 compute-0 podman[213385]: 2025-12-05 06:28:44.473482784 +0000 UTC m=+0.049342494 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, managed_by=edpm_ansible, version=9.6, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, architecture=x86_64, config_id=edpm)
Dec 05 06:28:44 compute-0 podman[213386]: 2025-12-05 06:28:44.485731425 +0000 UTC m=+0.057530971 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.build-date=20251202, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4)
Dec 05 06:28:44 compute-0 podman[213384]: 2025-12-05 06:28:44.495767114 +0000 UTC m=+0.075740870 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Dec 05 06:28:46 compute-0 nova_compute[186329]: 2025-12-05 06:28:46.645 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:28:48 compute-0 nova_compute[186329]: 2025-12-05 06:28:48.918 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:28:51 compute-0 nova_compute[186329]: 2025-12-05 06:28:51.646 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:28:53 compute-0 nova_compute[186329]: 2025-12-05 06:28:53.716 186333 DEBUG nova.virt.libvirt.driver [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 152d427b-2c4a-41ab-9e0a-8becaa1a46bc] Creating tmpfile /var/lib/nova/instances/tmp64ps9sdk to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Dec 05 06:28:53 compute-0 nova_compute[186329]: 2025-12-05 06:28:53.717 186333 WARNING neutronclient.v2_0.client [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:28:53 compute-0 nova_compute[186329]: 2025-12-05 06:28:53.801 186333 DEBUG nova.compute.manager [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp64ps9sdk',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Dec 05 06:28:53 compute-0 nova_compute[186329]: 2025-12-05 06:28:53.920 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:28:55 compute-0 nova_compute[186329]: 2025-12-05 06:28:55.704 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:28:55 compute-0 nova_compute[186329]: 2025-12-05 06:28:55.823 186333 WARNING neutronclient.v2_0.client [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:28:56 compute-0 nova_compute[186329]: 2025-12-05 06:28:56.649 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:28:56 compute-0 nova_compute[186329]: 2025-12-05 06:28:56.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:28:56 compute-0 nova_compute[186329]: 2025-12-05 06:28:56.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:28:57 compute-0 nova_compute[186329]: 2025-12-05 06:28:57.219 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:28:57 compute-0 nova_compute[186329]: 2025-12-05 06:28:57.219 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:28:57 compute-0 nova_compute[186329]: 2025-12-05 06:28:57.220 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:28:57 compute-0 nova_compute[186329]: 2025-12-05 06:28:57.220 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 05 06:28:58 compute-0 nova_compute[186329]: 2025-12-05 06:28:58.244 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbcb12e6-ebf7-49e2-847a-65f1b3a3266c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:28:58 compute-0 nova_compute[186329]: 2025-12-05 06:28:58.295 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbcb12e6-ebf7-49e2-847a-65f1b3a3266c/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:28:58 compute-0 nova_compute[186329]: 2025-12-05 06:28:58.296 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbcb12e6-ebf7-49e2-847a-65f1b3a3266c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:28:58 compute-0 nova_compute[186329]: 2025-12-05 06:28:58.335 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbcb12e6-ebf7-49e2-847a-65f1b3a3266c/disk --force-share --output=json" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:28:58 compute-0 nova_compute[186329]: 2025-12-05 06:28:58.337 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:28:58 compute-0 nova_compute[186329]: 2025-12-05 06:28:58.377 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk --force-share --output=json" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:28:58 compute-0 nova_compute[186329]: 2025-12-05 06:28:58.377 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:28:58 compute-0 nova_compute[186329]: 2025-12-05 06:28:58.416 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk --force-share --output=json" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:28:58 compute-0 nova_compute[186329]: 2025-12-05 06:28:58.592 186333 WARNING nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:28:58 compute-0 nova_compute[186329]: 2025-12-05 06:28:58.593 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:28:58 compute-0 nova_compute[186329]: 2025-12-05 06:28:58.605 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.012s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:28:58 compute-0 nova_compute[186329]: 2025-12-05 06:28:58.606 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5587MB free_disk=73.10902786254883GB free_vcpus=2 pci_devices=[{"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": 
"0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 05 06:28:58 compute-0 nova_compute[186329]: 2025-12-05 06:28:58.606 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:28:58 compute-0 nova_compute[186329]: 2025-12-05 06:28:58.606 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:28:58 compute-0 nova_compute[186329]: 2025-12-05 06:28:58.922 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:28:59 compute-0 nova_compute[186329]: 2025-12-05 06:28:59.550 186333 DEBUG nova.compute.manager [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp64ps9sdk',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='152d427b-2c4a-41ab-9e0a-8becaa1a46bc',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Dec 05 06:28:59 compute-0 nova_compute[186329]: 2025-12-05 06:28:59.619 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Migration for instance 152d427b-2c4a-41ab-9e0a-8becaa1a46bc refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Dec 05 06:28:59 compute-0 podman[196599]: time="2025-12-05T06:28:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:28:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:28:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:28:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:28:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2587 "" "Go-http-client/1.1"
Dec 05 06:29:00 compute-0 nova_compute[186329]: 2025-12-05 06:29:00.124 186333 INFO nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] [instance: 152d427b-2c4a-41ab-9e0a-8becaa1a46bc] Updating resource usage from migration 078082a7-e687-4e7a-9904-ad7412f16e42
Dec 05 06:29:00 compute-0 nova_compute[186329]: 2025-12-05 06:29:00.124 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] [instance: 152d427b-2c4a-41ab-9e0a-8becaa1a46bc] Starting to track incoming migration 078082a7-e687-4e7a-9904-ad7412f16e42 with flavor c46c94de-dc67-4994-bf4a-0c08b88ab6b7 _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Dec 05 06:29:00 compute-0 nova_compute[186329]: 2025-12-05 06:29:00.561 186333 DEBUG oslo_concurrency.lockutils [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "refresh_cache-152d427b-2c4a-41ab-9e0a-8becaa1a46bc" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:29:00 compute-0 nova_compute[186329]: 2025-12-05 06:29:00.561 186333 DEBUG oslo_concurrency.lockutils [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquired lock "refresh_cache-152d427b-2c4a-41ab-9e0a-8becaa1a46bc" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:29:00 compute-0 nova_compute[186329]: 2025-12-05 06:29:00.561 186333 DEBUG nova.network.neutron [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 152d427b-2c4a-41ab-9e0a-8becaa1a46bc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 05 06:29:01 compute-0 nova_compute[186329]: 2025-12-05 06:29:01.065 186333 WARNING neutronclient.v2_0.client [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:29:01 compute-0 nova_compute[186329]: 2025-12-05 06:29:01.151 186333 INFO nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance 83dcc9e9-9fcf-4e34-8b76-598c38ed9bc0 has allocations against this compute host but is not found in the database.
Dec 05 06:29:01 compute-0 openstack_network_exporter[198686]: ERROR   06:29:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:29:01 compute-0 openstack_network_exporter[198686]: ERROR   06:29:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:29:01 compute-0 openstack_network_exporter[198686]: ERROR   06:29:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:29:01 compute-0 openstack_network_exporter[198686]: ERROR   06:29:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:29:01 compute-0 openstack_network_exporter[198686]: ERROR   06:29:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:29:01 compute-0 nova_compute[186329]: 2025-12-05 06:29:01.650 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:01 compute-0 nova_compute[186329]: 2025-12-05 06:29:01.656 186333 WARNING nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance 152d427b-2c4a-41ab-9e0a-8becaa1a46bc has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 1152, 'VCPU': 1}}.
Dec 05 06:29:01 compute-0 nova_compute[186329]: 2025-12-05 06:29:01.656 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 05 06:29:01 compute-0 nova_compute[186329]: 2025-12-05 06:29:01.656 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=1664MB phys_disk=79GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 06:28:58 up  1:06,  0 user,  load average: 0.04, 0.13, 0.22\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 05 06:29:01 compute-0 nova_compute[186329]: 2025-12-05 06:29:01.690 186333 DEBUG nova.compute.provider_tree [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:29:01 compute-0 nova_compute[186329]: 2025-12-05 06:29:01.893 186333 WARNING neutronclient.v2_0.client [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:29:02 compute-0 nova_compute[186329]: 2025-12-05 06:29:02.195 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:29:02 compute-0 nova_compute[186329]: 2025-12-05 06:29:02.601 186333 DEBUG nova.network.neutron [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 152d427b-2c4a-41ab-9e0a-8becaa1a46bc] Updating instance_info_cache with network_info: [{"id": "f3848980-840c-4db4-8a83-263a6a7ee341", "address": "fa:16:3e:66:b2:8b", "network": {"id": "a0d03e43-34ea-4028-a2db-d5149175a508", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1338527276-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a604f2bba734821844adb92d1c578a7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3848980-84", "ovs_interfaceid": "f3848980-840c-4db4-8a83-263a6a7ee341", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:29:02 compute-0 nova_compute[186329]: 2025-12-05 06:29:02.700 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 05 06:29:02 compute-0 nova_compute[186329]: 2025-12-05 06:29:02.701 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.094s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:29:03 compute-0 nova_compute[186329]: 2025-12-05 06:29:03.104 186333 DEBUG oslo_concurrency.lockutils [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Releasing lock "refresh_cache-152d427b-2c4a-41ab-9e0a-8becaa1a46bc" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:29:03 compute-0 nova_compute[186329]: 2025-12-05 06:29:03.111 186333 DEBUG nova.virt.libvirt.driver [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 152d427b-2c4a-41ab-9e0a-8becaa1a46bc] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp64ps9sdk',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='152d427b-2c4a-41ab-9e0a-8becaa1a46bc',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Dec 05 06:29:03 compute-0 nova_compute[186329]: 2025-12-05 06:29:03.112 186333 DEBUG nova.virt.libvirt.driver [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 152d427b-2c4a-41ab-9e0a-8becaa1a46bc] Creating instance directory: /var/lib/nova/instances/152d427b-2c4a-41ab-9e0a-8becaa1a46bc pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Dec 05 06:29:03 compute-0 nova_compute[186329]: 2025-12-05 06:29:03.112 186333 DEBUG nova.virt.libvirt.driver [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 152d427b-2c4a-41ab-9e0a-8becaa1a46bc] Creating disk.info with the contents: {'/var/lib/nova/instances/152d427b-2c4a-41ab-9e0a-8becaa1a46bc/disk': 'qcow2', '/var/lib/nova/instances/152d427b-2c4a-41ab-9e0a-8becaa1a46bc/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Dec 05 06:29:03 compute-0 nova_compute[186329]: 2025-12-05 06:29:03.112 186333 DEBUG nova.virt.libvirt.driver [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 152d427b-2c4a-41ab-9e0a-8becaa1a46bc] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Dec 05 06:29:03 compute-0 nova_compute[186329]: 2025-12-05 06:29:03.113 186333 DEBUG nova.objects.instance [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 152d427b-2c4a-41ab-9e0a-8becaa1a46bc obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:29:03 compute-0 nova_compute[186329]: 2025-12-05 06:29:03.616 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:29:03 compute-0 nova_compute[186329]: 2025-12-05 06:29:03.619 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:29:03 compute-0 nova_compute[186329]: 2025-12-05 06:29:03.624 186333 DEBUG oslo_concurrency.processutils [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:29:03 compute-0 nova_compute[186329]: 2025-12-05 06:29:03.664 186333 DEBUG oslo_concurrency.processutils [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:29:03 compute-0 nova_compute[186329]: 2025-12-05 06:29:03.665 186333 DEBUG oslo_concurrency.lockutils [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:29:03 compute-0 nova_compute[186329]: 2025-12-05 06:29:03.665 186333 DEBUG oslo_concurrency.lockutils [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:29:03 compute-0 nova_compute[186329]: 2025-12-05 06:29:03.665 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:29:03 compute-0 nova_compute[186329]: 2025-12-05 06:29:03.668 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:29:03 compute-0 nova_compute[186329]: 2025-12-05 06:29:03.668 186333 DEBUG oslo_concurrency.processutils [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:29:03 compute-0 nova_compute[186329]: 2025-12-05 06:29:03.706 186333 DEBUG oslo_concurrency.processutils [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.038s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:29:03 compute-0 nova_compute[186329]: 2025-12-05 06:29:03.707 186333 DEBUG oslo_concurrency.processutils [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b,backing_fmt=raw /var/lib/nova/instances/152d427b-2c4a-41ab-9e0a-8becaa1a46bc/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:29:03 compute-0 nova_compute[186329]: 2025-12-05 06:29:03.725 186333 DEBUG oslo_concurrency.processutils [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b,backing_fmt=raw /var/lib/nova/instances/152d427b-2c4a-41ab-9e0a-8becaa1a46bc/disk 1073741824" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:29:03 compute-0 nova_compute[186329]: 2025-12-05 06:29:03.725 186333 DEBUG oslo_concurrency.lockutils [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.060s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:29:03 compute-0 nova_compute[186329]: 2025-12-05 06:29:03.726 186333 DEBUG oslo_concurrency.processutils [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:29:03 compute-0 nova_compute[186329]: 2025-12-05 06:29:03.766 186333 DEBUG oslo_concurrency.processutils [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:29:03 compute-0 nova_compute[186329]: 2025-12-05 06:29:03.766 186333 DEBUG nova.virt.disk.api [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Checking if we can resize image /var/lib/nova/instances/152d427b-2c4a-41ab-9e0a-8becaa1a46bc/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 05 06:29:03 compute-0 nova_compute[186329]: 2025-12-05 06:29:03.767 186333 DEBUG oslo_concurrency.processutils [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/152d427b-2c4a-41ab-9e0a-8becaa1a46bc/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:29:03 compute-0 nova_compute[186329]: 2025-12-05 06:29:03.818 186333 DEBUG oslo_concurrency.processutils [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/152d427b-2c4a-41ab-9e0a-8becaa1a46bc/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:29:03 compute-0 nova_compute[186329]: 2025-12-05 06:29:03.818 186333 DEBUG nova.virt.disk.api [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Cannot resize image /var/lib/nova/instances/152d427b-2c4a-41ab-9e0a-8becaa1a46bc/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 05 06:29:03 compute-0 nova_compute[186329]: 2025-12-05 06:29:03.819 186333 DEBUG nova.objects.instance [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lazy-loading 'migration_context' on Instance uuid 152d427b-2c4a-41ab-9e0a-8becaa1a46bc obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:29:03 compute-0 nova_compute[186329]: 2025-12-05 06:29:03.924 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:04 compute-0 nova_compute[186329]: 2025-12-05 06:29:04.323 186333 DEBUG nova.objects.base [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Object Instance<152d427b-2c4a-41ab-9e0a-8becaa1a46bc> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Dec 05 06:29:04 compute-0 nova_compute[186329]: 2025-12-05 06:29:04.324 186333 DEBUG oslo_concurrency.processutils [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/152d427b-2c4a-41ab-9e0a-8becaa1a46bc/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:29:04 compute-0 nova_compute[186329]: 2025-12-05 06:29:04.340 186333 DEBUG oslo_concurrency.processutils [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/152d427b-2c4a-41ab-9e0a-8becaa1a46bc/disk.config 497664" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:29:04 compute-0 nova_compute[186329]: 2025-12-05 06:29:04.341 186333 DEBUG nova.virt.libvirt.driver [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 152d427b-2c4a-41ab-9e0a-8becaa1a46bc] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Dec 05 06:29:04 compute-0 nova_compute[186329]: 2025-12-05 06:29:04.342 186333 DEBUG nova.virt.libvirt.vif [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-05T06:28:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-1057592448',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-1057592448',id=20,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T06:28:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1152,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d7bd52c6f43e4c0fbd1009b9f0994d4c',ramdisk_id='',reservation_id='r-zzhh09hb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vi
f_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1379393409',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-1379393409-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T06:28:23Z,user_data=None,user_id='f26b0764633e44508f3eff072931d01d',uuid=152d427b-2c4a-41ab-9e0a-8becaa1a46bc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f3848980-840c-4db4-8a83-263a6a7ee341", "address": "fa:16:3e:66:b2:8b", "network": {"id": "a0d03e43-34ea-4028-a2db-d5149175a508", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1338527276-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a604f2bba734821844adb92d1c578a7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapf3848980-84", "ovs_interfaceid": "f3848980-840c-4db4-8a83-263a6a7ee341", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 05 06:29:04 compute-0 nova_compute[186329]: 2025-12-05 06:29:04.342 186333 DEBUG nova.network.os_vif_util [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Converting VIF {"id": "f3848980-840c-4db4-8a83-263a6a7ee341", "address": "fa:16:3e:66:b2:8b", "network": {"id": "a0d03e43-34ea-4028-a2db-d5149175a508", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1338527276-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a604f2bba734821844adb92d1c578a7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapf3848980-84", "ovs_interfaceid": "f3848980-840c-4db4-8a83-263a6a7ee341", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:29:04 compute-0 nova_compute[186329]: 2025-12-05 06:29:04.343 186333 DEBUG nova.network.os_vif_util [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:66:b2:8b,bridge_name='br-int',has_traffic_filtering=True,id=f3848980-840c-4db4-8a83-263a6a7ee341,network=Network(a0d03e43-34ea-4028-a2db-d5149175a508),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3848980-84') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:29:04 compute-0 nova_compute[186329]: 2025-12-05 06:29:04.343 186333 DEBUG os_vif [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:66:b2:8b,bridge_name='br-int',has_traffic_filtering=True,id=f3848980-840c-4db4-8a83-263a6a7ee341,network=Network(a0d03e43-34ea-4028-a2db-d5149175a508),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3848980-84') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 05 06:29:04 compute-0 nova_compute[186329]: 2025-12-05 06:29:04.344 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:04 compute-0 nova_compute[186329]: 2025-12-05 06:29:04.345 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:29:04 compute-0 nova_compute[186329]: 2025-12-05 06:29:04.345 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:29:04 compute-0 nova_compute[186329]: 2025-12-05 06:29:04.346 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:04 compute-0 nova_compute[186329]: 2025-12-05 06:29:04.346 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '184b87ad-e016-5c31-9b8d-2517220b8790', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:29:04 compute-0 nova_compute[186329]: 2025-12-05 06:29:04.347 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:04 compute-0 nova_compute[186329]: 2025-12-05 06:29:04.348 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:04 compute-0 nova_compute[186329]: 2025-12-05 06:29:04.350 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:04 compute-0 nova_compute[186329]: 2025-12-05 06:29:04.350 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf3848980-84, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:29:04 compute-0 nova_compute[186329]: 2025-12-05 06:29:04.351 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapf3848980-84, col_values=(('qos', UUID('f3620838-c1c5-4d76-95c5-50f7135ebedd')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:29:04 compute-0 nova_compute[186329]: 2025-12-05 06:29:04.351 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapf3848980-84, col_values=(('external_ids', {'iface-id': 'f3848980-840c-4db4-8a83-263a6a7ee341', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:66:b2:8b', 'vm-uuid': '152d427b-2c4a-41ab-9e0a-8becaa1a46bc'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:29:04 compute-0 nova_compute[186329]: 2025-12-05 06:29:04.352 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:04 compute-0 NetworkManager[55434]: <info>  [1764916144.3530] manager: (tapf3848980-84): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/77)
Dec 05 06:29:04 compute-0 nova_compute[186329]: 2025-12-05 06:29:04.355 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:29:04 compute-0 nova_compute[186329]: 2025-12-05 06:29:04.357 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:04 compute-0 nova_compute[186329]: 2025-12-05 06:29:04.357 186333 INFO os_vif [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:66:b2:8b,bridge_name='br-int',has_traffic_filtering=True,id=f3848980-840c-4db4-8a83-263a6a7ee341,network=Network(a0d03e43-34ea-4028-a2db-d5149175a508),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3848980-84')
Dec 05 06:29:04 compute-0 nova_compute[186329]: 2025-12-05 06:29:04.357 186333 DEBUG nova.virt.libvirt.driver [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Dec 05 06:29:04 compute-0 nova_compute[186329]: 2025-12-05 06:29:04.357 186333 DEBUG nova.compute.manager [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp64ps9sdk',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='152d427b-2c4a-41ab-9e0a-8becaa1a46bc',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Dec 05 06:29:04 compute-0 nova_compute[186329]: 2025-12-05 06:29:04.358 186333 WARNING neutronclient.v2_0.client [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:29:04 compute-0 nova_compute[186329]: 2025-12-05 06:29:04.597 186333 WARNING neutronclient.v2_0.client [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:29:04 compute-0 nova_compute[186329]: 2025-12-05 06:29:04.696 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:29:04 compute-0 nova_compute[186329]: 2025-12-05 06:29:04.696 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:29:04 compute-0 nova_compute[186329]: 2025-12-05 06:29:04.696 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:29:04 compute-0 nova_compute[186329]: 2025-12-05 06:29:04.697 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:29:04 compute-0 nova_compute[186329]: 2025-12-05 06:29:04.697 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:29:04 compute-0 nova_compute[186329]: 2025-12-05 06:29:04.697 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:29:04 compute-0 nova_compute[186329]: 2025-12-05 06:29:04.697 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 05 06:29:05 compute-0 nova_compute[186329]: 2025-12-05 06:29:05.007 186333 DEBUG nova.network.neutron [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 152d427b-2c4a-41ab-9e0a-8becaa1a46bc] Port f3848980-840c-4db4-8a83-263a6a7ee341 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Dec 05 06:29:05 compute-0 nova_compute[186329]: 2025-12-05 06:29:05.014 186333 DEBUG nova.compute.manager [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp64ps9sdk',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='152d427b-2c4a-41ab-9e0a-8becaa1a46bc',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Dec 05 06:29:05 compute-0 podman[213472]: 2025-12-05 06:29:05.457259556 +0000 UTC m=+0.036624171 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 06:29:05 compute-0 podman[213471]: 2025-12-05 06:29:05.477110811 +0000 UTC m=+0.059144286 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true)
Dec 05 06:29:08 compute-0 kernel: tapf3848980-84: entered promiscuous mode
Dec 05 06:29:08 compute-0 NetworkManager[55434]: <info>  [1764916148.4868] manager: (tapf3848980-84): new Tun device (/org/freedesktop/NetworkManager/Devices/78)
Dec 05 06:29:08 compute-0 ovn_controller[95223]: 2025-12-05T06:29:08Z|00175|binding|INFO|Claiming lport f3848980-840c-4db4-8a83-263a6a7ee341 for this additional chassis.
Dec 05 06:29:08 compute-0 ovn_controller[95223]: 2025-12-05T06:29:08Z|00176|binding|INFO|f3848980-840c-4db4-8a83-263a6a7ee341: Claiming fa:16:3e:66:b2:8b 10.100.0.10
Dec 05 06:29:08 compute-0 nova_compute[186329]: 2025-12-05 06:29:08.489 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:08.502 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:66:b2:8b 10.100.0.10'], port_security=['fa:16:3e:66:b2:8b 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '152d427b-2c4a-41ab-9e0a-8becaa1a46bc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a0d03e43-34ea-4028-a2db-d5149175a508', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd7bd52c6f43e4c0fbd1009b9f0994d4c', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'f6ae1da8-70b1-4c7e-8cc3-a298827abbb1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ea3a7080-e915-4f20-bdbc-22c4af619c01, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=f3848980-840c-4db4-8a83-263a6a7ee341) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:29:08 compute-0 ovn_controller[95223]: 2025-12-05T06:29:08Z|00177|binding|INFO|Setting lport f3848980-840c-4db4-8a83-263a6a7ee341 ovn-installed in OVS
Dec 05 06:29:08 compute-0 nova_compute[186329]: 2025-12-05 06:29:08.503 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:08.503 104041 INFO neutron.agent.ovn.metadata.agent [-] Port f3848980-840c-4db4-8a83-263a6a7ee341 in datapath a0d03e43-34ea-4028-a2db-d5149175a508 unbound from our chassis
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:08.503 104041 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a0d03e43-34ea-4028-a2db-d5149175a508
Dec 05 06:29:08 compute-0 nova_compute[186329]: 2025-12-05 06:29:08.504 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:08 compute-0 systemd-udevd[213531]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:08.512 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[99d68eb1-0259-495e-942a-2479c9afa96f]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:08.513 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa0d03e43-31 in ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:08.516 206383 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa0d03e43-30 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:08.516 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[572c62a5-b105-461c-84d7-0648493b707c]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:08.517 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[d6405937-3b35-48aa-9ba1-59cb1ed7261e]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:29:08 compute-0 systemd-machined[152967]: New machine qemu-16-instance-00000014.
Dec 05 06:29:08 compute-0 NetworkManager[55434]: <info>  [1764916148.5241] device (tapf3848980-84): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 06:29:08 compute-0 NetworkManager[55434]: <info>  [1764916148.5249] device (tapf3848980-84): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 06:29:08 compute-0 systemd[1]: Started Virtual Machine qemu-16-instance-00000014.
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:08.526 104153 DEBUG oslo.privsep.daemon [-] privsep: reply[698f8014-5b36-4e6d-bc59-dec23cdab2d3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:08.541 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[4eaa8483-3c68-436b-94b4-950dc227d7f0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:08.560 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[0e4acf28-6e3a-4ac2-8ddd-9363ced45cee]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:29:08 compute-0 NetworkManager[55434]: <info>  [1764916148.5642] manager: (tapa0d03e43-30): new Veth device (/org/freedesktop/NetworkManager/Devices/79)
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:08.564 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[92afcb04-6030-4c45-b5c1-eb377b7cd2ab]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:29:08 compute-0 systemd-udevd[213535]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:08.584 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[bc0090fa-5c7f-4d1a-9ad8-eac65614386b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:08.586 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[0c3210ad-b9e5-4ac2-98c0-fdcd9fcf3c41]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:29:08 compute-0 NetworkManager[55434]: <info>  [1764916148.6040] device (tapa0d03e43-30): carrier: link connected
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:08.607 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[3f5a6d9a-3982-4140-988d-40af03a16dbb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:08.620 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[3648411e-a7ff-4fb7-a50d-b84d91cf37e2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa0d03e43-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:47:f5:eb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402713, 'reachable_time': 38195, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213556, 'error': None, 'target': 'ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:08.630 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[36e83aa3-da43-457b-91b0-6a8e9236c659]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe47:f5eb'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 402713, 'tstamp': 402713}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213557, 'error': None, 'target': 'ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:08.642 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[c25879e6-07e8-4d87-aed2-110fdcf3b469]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa0d03e43-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:47:f5:eb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402713, 'reachable_time': 38195, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213558, 'error': None, 'target': 'ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:08.666 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[c00adae4-dd60-4f32-9fa8-8cc079b8f742]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:08.710 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[68bd4b11-731a-4f39-8337-ed9028617b0b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:08.711 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa0d03e43-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:08.711 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:08.712 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa0d03e43-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:29:08 compute-0 kernel: tapa0d03e43-30: entered promiscuous mode
Dec 05 06:29:08 compute-0 NetworkManager[55434]: <info>  [1764916148.7138] manager: (tapa0d03e43-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/80)
Dec 05 06:29:08 compute-0 nova_compute[186329]: 2025-12-05 06:29:08.714 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:08.715 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa0d03e43-30, col_values=(('external_ids', {'iface-id': '88888304-ddfb-428f-a56e-30bd63031520'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:29:08 compute-0 nova_compute[186329]: 2025-12-05 06:29:08.716 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:08 compute-0 ovn_controller[95223]: 2025-12-05T06:29:08Z|00178|binding|INFO|Releasing lport 88888304-ddfb-428f-a56e-30bd63031520 from this chassis (sb_readonly=0)
Dec 05 06:29:08 compute-0 nova_compute[186329]: 2025-12-05 06:29:08.728 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:08.730 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[2b946fad-3522-4034-8165-a98fe061f866]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:08.730 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a0d03e43-34ea-4028-a2db-d5149175a508.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a0d03e43-34ea-4028-a2db-d5149175a508.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:08.730 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a0d03e43-34ea-4028-a2db-d5149175a508.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a0d03e43-34ea-4028-a2db-d5149175a508.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:08.730 104041 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for a0d03e43-34ea-4028-a2db-d5149175a508 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:08.731 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a0d03e43-34ea-4028-a2db-d5149175a508.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a0d03e43-34ea-4028-a2db-d5149175a508.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:08.731 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[f292c847-82ea-4556-9fe2-a64bc1db39ad]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:08.731 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a0d03e43-34ea-4028-a2db-d5149175a508.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a0d03e43-34ea-4028-a2db-d5149175a508.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:08.731 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[0bfbeea7-1cf5-462c-b0f1-d85c1a2282ed]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:08.732 104041 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]: global
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]:     log         /dev/log local0 debug
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]:     log-tag     haproxy-metadata-proxy-a0d03e43-34ea-4028-a2db-d5149175a508
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]:     user        root
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]:     group       root
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]:     maxconn     1024
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]:     pidfile     /var/lib/neutron/external/pids/a0d03e43-34ea-4028-a2db-d5149175a508.pid.haproxy
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]:     daemon
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]: defaults
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]:     log global
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]:     mode http
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]:     option httplog
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]:     option dontlognull
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]:     option http-server-close
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]:     option forwardfor
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]:     retries                 3
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]:     timeout http-request    30s
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]:     timeout connect         30s
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]:     timeout client          32s
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]:     timeout server          32s
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]:     timeout http-keep-alive 30s
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]: listen listener
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]:     bind 169.254.169.254:80
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]:     
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]:     http-request add-header X-OVN-Network-ID a0d03e43-34ea-4028-a2db-d5149175a508
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 05 06:29:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:08.732 104041 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508', 'env', 'PROCESS_TAG=haproxy-a0d03e43-34ea-4028-a2db-d5149175a508', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a0d03e43-34ea-4028-a2db-d5149175a508.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 05 06:29:08 compute-0 nova_compute[186329]: 2025-12-05 06:29:08.924 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:09 compute-0 podman[213594]: 2025-12-05 06:29:09.0386018 +0000 UTC m=+0.033859514 container create 3fafa46ac2f5667106886fff9b7fa532f7ea7e33bf4edd3cac67260ae8516af3 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4)
Dec 05 06:29:09 compute-0 systemd[1]: Started libpod-conmon-3fafa46ac2f5667106886fff9b7fa532f7ea7e33bf4edd3cac67260ae8516af3.scope.
Dec 05 06:29:09 compute-0 systemd[1]: Started libcrun container.
Dec 05 06:29:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f5353e8db664a8bd790a3fa92ebb600be7e909bab5934d0ec2094aeb4c0f3278/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 06:29:09 compute-0 podman[213594]: 2025-12-05 06:29:09.109079984 +0000 UTC m=+0.104337717 container init 3fafa46ac2f5667106886fff9b7fa532f7ea7e33bf4edd3cac67260ae8516af3 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Dec 05 06:29:09 compute-0 podman[213594]: 2025-12-05 06:29:09.114756698 +0000 UTC m=+0.110014411 container start 3fafa46ac2f5667106886fff9b7fa532f7ea7e33bf4edd3cac67260ae8516af3 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 05 06:29:09 compute-0 podman[213594]: 2025-12-05 06:29:09.023533128 +0000 UTC m=+0.018790842 image pull 773fbabd60bc1c8e2ad203301a187a593937fa42bcc751aba0971bd96baac0cb quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current
Dec 05 06:29:09 compute-0 neutron-haproxy-ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508[213606]: [NOTICE]   (213610) : New worker (213612) forked
Dec 05 06:29:09 compute-0 neutron-haproxy-ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508[213606]: [NOTICE]   (213610) : Loading success.
Dec 05 06:29:09 compute-0 nova_compute[186329]: 2025-12-05 06:29:09.352 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:11 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:11.784 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '46:84:d1', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'a6:99:88:db:d9:a2'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:29:11 compute-0 nova_compute[186329]: 2025-12-05 06:29:11.785 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:11 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:11.786 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 05 06:29:11 compute-0 ovn_controller[95223]: 2025-12-05T06:29:11Z|00179|binding|INFO|Claiming lport f3848980-840c-4db4-8a83-263a6a7ee341 for this chassis.
Dec 05 06:29:11 compute-0 ovn_controller[95223]: 2025-12-05T06:29:11Z|00180|binding|INFO|f3848980-840c-4db4-8a83-263a6a7ee341: Claiming fa:16:3e:66:b2:8b 10.100.0.10
Dec 05 06:29:11 compute-0 ovn_controller[95223]: 2025-12-05T06:29:11Z|00181|binding|INFO|Setting lport f3848980-840c-4db4-8a83-263a6a7ee341 up in Southbound
Dec 05 06:29:12 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:12.787 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=89d40815-76f5-4f1d-9077-84d831b7d6c4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:29:12 compute-0 nova_compute[186329]: 2025-12-05 06:29:12.903 186333 INFO nova.compute.manager [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 152d427b-2c4a-41ab-9e0a-8becaa1a46bc] Post operation of migration started
Dec 05 06:29:12 compute-0 nova_compute[186329]: 2025-12-05 06:29:12.903 186333 WARNING neutronclient.v2_0.client [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:29:13 compute-0 nova_compute[186329]: 2025-12-05 06:29:13.599 186333 WARNING neutronclient.v2_0.client [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:29:13 compute-0 nova_compute[186329]: 2025-12-05 06:29:13.600 186333 WARNING neutronclient.v2_0.client [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:29:13 compute-0 nova_compute[186329]: 2025-12-05 06:29:13.668 186333 DEBUG oslo_concurrency.lockutils [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "refresh_cache-152d427b-2c4a-41ab-9e0a-8becaa1a46bc" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:29:13 compute-0 nova_compute[186329]: 2025-12-05 06:29:13.668 186333 DEBUG oslo_concurrency.lockutils [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquired lock "refresh_cache-152d427b-2c4a-41ab-9e0a-8becaa1a46bc" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:29:13 compute-0 nova_compute[186329]: 2025-12-05 06:29:13.668 186333 DEBUG nova.network.neutron [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 152d427b-2c4a-41ab-9e0a-8becaa1a46bc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 05 06:29:13 compute-0 nova_compute[186329]: 2025-12-05 06:29:13.926 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:14 compute-0 nova_compute[186329]: 2025-12-05 06:29:14.172 186333 WARNING neutronclient.v2_0.client [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:29:14 compute-0 nova_compute[186329]: 2025-12-05 06:29:14.353 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:14 compute-0 nova_compute[186329]: 2025-12-05 06:29:14.807 186333 WARNING neutronclient.v2_0.client [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:29:14 compute-0 nova_compute[186329]: 2025-12-05 06:29:14.924 186333 DEBUG nova.network.neutron [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 152d427b-2c4a-41ab-9e0a-8becaa1a46bc] Updating instance_info_cache with network_info: [{"id": "f3848980-840c-4db4-8a83-263a6a7ee341", "address": "fa:16:3e:66:b2:8b", "network": {"id": "a0d03e43-34ea-4028-a2db-d5149175a508", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1338527276-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a604f2bba734821844adb92d1c578a7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3848980-84", "ovs_interfaceid": "f3848980-840c-4db4-8a83-263a6a7ee341", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:29:15 compute-0 nova_compute[186329]: 2025-12-05 06:29:15.428 186333 DEBUG oslo_concurrency.lockutils [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Releasing lock "refresh_cache-152d427b-2c4a-41ab-9e0a-8becaa1a46bc" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:29:15 compute-0 podman[213631]: 2025-12-05 06:29:15.476418347 +0000 UTC m=+0.051757736 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.build-date=20251202, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Dec 05 06:29:15 compute-0 podman[213629]: 2025-12-05 06:29:15.479105489 +0000 UTC m=+0.057461991 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 05 06:29:15 compute-0 podman[213630]: 2025-12-05 06:29:15.499400929 +0000 UTC m=+0.076891283 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, config_id=edpm, release=1755695350, vcs-type=git, architecture=x86_64, name=ubi9-minimal, io.openshift.expose-services=, managed_by=edpm_ansible, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 05 06:29:15 compute-0 nova_compute[186329]: 2025-12-05 06:29:15.938 186333 DEBUG oslo_concurrency.lockutils [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:29:15 compute-0 nova_compute[186329]: 2025-12-05 06:29:15.939 186333 DEBUG oslo_concurrency.lockutils [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:29:15 compute-0 nova_compute[186329]: 2025-12-05 06:29:15.939 186333 DEBUG oslo_concurrency.lockutils [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:29:15 compute-0 nova_compute[186329]: 2025-12-05 06:29:15.942 186333 INFO nova.virt.libvirt.driver [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 152d427b-2c4a-41ab-9e0a-8becaa1a46bc] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Dec 05 06:29:15 compute-0 virtqemud[186605]: Domain id=16 name='instance-00000014' uuid=152d427b-2c4a-41ab-9e0a-8becaa1a46bc is tainted: custom-monitor
Dec 05 06:29:16 compute-0 nova_compute[186329]: 2025-12-05 06:29:16.946 186333 INFO nova.virt.libvirt.driver [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 152d427b-2c4a-41ab-9e0a-8becaa1a46bc] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Dec 05 06:29:17 compute-0 nova_compute[186329]: 2025-12-05 06:29:17.951 186333 INFO nova.virt.libvirt.driver [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 152d427b-2c4a-41ab-9e0a-8becaa1a46bc] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Dec 05 06:29:17 compute-0 nova_compute[186329]: 2025-12-05 06:29:17.954 186333 DEBUG nova.compute.manager [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 152d427b-2c4a-41ab-9e0a-8becaa1a46bc] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 05 06:29:18 compute-0 nova_compute[186329]: 2025-12-05 06:29:18.462 186333 DEBUG nova.objects.instance [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 152d427b-2c4a-41ab-9e0a-8becaa1a46bc] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Dec 05 06:29:18 compute-0 nova_compute[186329]: 2025-12-05 06:29:18.928 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:19 compute-0 nova_compute[186329]: 2025-12-05 06:29:19.355 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:19 compute-0 nova_compute[186329]: 2025-12-05 06:29:19.473 186333 WARNING neutronclient.v2_0.client [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:29:19 compute-0 nova_compute[186329]: 2025-12-05 06:29:19.564 186333 WARNING neutronclient.v2_0.client [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:29:19 compute-0 nova_compute[186329]: 2025-12-05 06:29:19.565 186333 WARNING neutronclient.v2_0.client [None req-74973af7-451b-42ff-9fc3-387f39ba8789 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:29:23 compute-0 nova_compute[186329]: 2025-12-05 06:29:23.929 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:24 compute-0 nova_compute[186329]: 2025-12-05 06:29:24.357 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:28 compute-0 nova_compute[186329]: 2025-12-05 06:29:28.930 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:29 compute-0 nova_compute[186329]: 2025-12-05 06:29:29.357 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:29.515 104041 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:29:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:29.516 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:29:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:29.516 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:29:29 compute-0 podman[196599]: time="2025-12-05T06:29:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:29:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:29:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18590 "" "Go-http-client/1.1"
Dec 05 06:29:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:29:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3052 "" "Go-http-client/1.1"
Dec 05 06:29:31 compute-0 openstack_network_exporter[198686]: ERROR   06:29:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:29:31 compute-0 openstack_network_exporter[198686]: ERROR   06:29:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:29:31 compute-0 openstack_network_exporter[198686]: ERROR   06:29:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:29:31 compute-0 openstack_network_exporter[198686]: ERROR   06:29:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:29:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:29:31 compute-0 openstack_network_exporter[198686]: ERROR   06:29:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:29:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:29:33 compute-0 nova_compute[186329]: 2025-12-05 06:29:33.931 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:34 compute-0 nova_compute[186329]: 2025-12-05 06:29:34.359 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:35 compute-0 nova_compute[186329]: 2025-12-05 06:29:35.034 186333 DEBUG oslo_concurrency.lockutils [None req-a7bfc4ec-4971-4314-bdbb-1612d0f5d08e f26b0764633e44508f3eff072931d01d d7bd52c6f43e4c0fbd1009b9f0994d4c - - default default] Acquiring lock "152d427b-2c4a-41ab-9e0a-8becaa1a46bc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:29:35 compute-0 nova_compute[186329]: 2025-12-05 06:29:35.035 186333 DEBUG oslo_concurrency.lockutils [None req-a7bfc4ec-4971-4314-bdbb-1612d0f5d08e f26b0764633e44508f3eff072931d01d d7bd52c6f43e4c0fbd1009b9f0994d4c - - default default] Lock "152d427b-2c4a-41ab-9e0a-8becaa1a46bc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:29:35 compute-0 nova_compute[186329]: 2025-12-05 06:29:35.035 186333 DEBUG oslo_concurrency.lockutils [None req-a7bfc4ec-4971-4314-bdbb-1612d0f5d08e f26b0764633e44508f3eff072931d01d d7bd52c6f43e4c0fbd1009b9f0994d4c - - default default] Acquiring lock "152d427b-2c4a-41ab-9e0a-8becaa1a46bc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:29:35 compute-0 nova_compute[186329]: 2025-12-05 06:29:35.035 186333 DEBUG oslo_concurrency.lockutils [None req-a7bfc4ec-4971-4314-bdbb-1612d0f5d08e f26b0764633e44508f3eff072931d01d d7bd52c6f43e4c0fbd1009b9f0994d4c - - default default] Lock "152d427b-2c4a-41ab-9e0a-8becaa1a46bc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:29:35 compute-0 nova_compute[186329]: 2025-12-05 06:29:35.035 186333 DEBUG oslo_concurrency.lockutils [None req-a7bfc4ec-4971-4314-bdbb-1612d0f5d08e f26b0764633e44508f3eff072931d01d d7bd52c6f43e4c0fbd1009b9f0994d4c - - default default] Lock "152d427b-2c4a-41ab-9e0a-8becaa1a46bc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:29:35 compute-0 nova_compute[186329]: 2025-12-05 06:29:35.042 186333 INFO nova.compute.manager [None req-a7bfc4ec-4971-4314-bdbb-1612d0f5d08e f26b0764633e44508f3eff072931d01d d7bd52c6f43e4c0fbd1009b9f0994d4c - - default default] [instance: 152d427b-2c4a-41ab-9e0a-8becaa1a46bc] Terminating instance
Dec 05 06:29:35 compute-0 nova_compute[186329]: 2025-12-05 06:29:35.550 186333 DEBUG nova.compute.manager [None req-a7bfc4ec-4971-4314-bdbb-1612d0f5d08e f26b0764633e44508f3eff072931d01d d7bd52c6f43e4c0fbd1009b9f0994d4c - - default default] [instance: 152d427b-2c4a-41ab-9e0a-8becaa1a46bc] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 05 06:29:35 compute-0 kernel: tapf3848980-84 (unregistering): left promiscuous mode
Dec 05 06:29:35 compute-0 NetworkManager[55434]: <info>  [1764916175.5748] device (tapf3848980-84): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 06:29:35 compute-0 nova_compute[186329]: 2025-12-05 06:29:35.580 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:35 compute-0 nova_compute[186329]: 2025-12-05 06:29:35.581 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:35 compute-0 ovn_controller[95223]: 2025-12-05T06:29:35Z|00182|binding|INFO|Releasing lport f3848980-840c-4db4-8a83-263a6a7ee341 from this chassis (sb_readonly=0)
Dec 05 06:29:35 compute-0 ovn_controller[95223]: 2025-12-05T06:29:35Z|00183|binding|INFO|Setting lport f3848980-840c-4db4-8a83-263a6a7ee341 down in Southbound
Dec 05 06:29:35 compute-0 ovn_controller[95223]: 2025-12-05T06:29:35Z|00184|binding|INFO|Removing iface tapf3848980-84 ovn-installed in OVS
Dec 05 06:29:35 compute-0 nova_compute[186329]: 2025-12-05 06:29:35.593 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:35 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:35.593 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:66:b2:8b 10.100.0.10'], port_security=['fa:16:3e:66:b2:8b 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '152d427b-2c4a-41ab-9e0a-8becaa1a46bc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a0d03e43-34ea-4028-a2db-d5149175a508', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd7bd52c6f43e4c0fbd1009b9f0994d4c', 'neutron:revision_number': '14', 'neutron:security_group_ids': 'f6ae1da8-70b1-4c7e-8cc3-a298827abbb1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ea3a7080-e915-4f20-bdbc-22c4af619c01, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], logical_port=f3848980-840c-4db4-8a83-263a6a7ee341) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:29:35 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:35.593 104041 INFO neutron.agent.ovn.metadata.agent [-] Port f3848980-840c-4db4-8a83-263a6a7ee341 in datapath a0d03e43-34ea-4028-a2db-d5149175a508 unbound from our chassis
Dec 05 06:29:35 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:35.595 104041 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a0d03e43-34ea-4028-a2db-d5149175a508, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 05 06:29:35 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:35.595 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[fe20a4e6-bdda-48f9-9cdb-9b1ca80a6f93]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:29:35 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:35.596 104041 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508 namespace which is not needed anymore
Dec 05 06:29:35 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000014.scope: Deactivated successfully.
Dec 05 06:29:35 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000014.scope: Consumed 2.970s CPU time.
Dec 05 06:29:35 compute-0 systemd-machined[152967]: Machine qemu-16-instance-00000014 terminated.
Dec 05 06:29:35 compute-0 podman[213681]: 2025-12-05 06:29:35.655445951 +0000 UTC m=+0.075842952 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 06:29:35 compute-0 podman[213680]: 2025-12-05 06:29:35.678566562 +0000 UTC m=+0.104948336 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec 05 06:29:35 compute-0 nova_compute[186329]: 2025-12-05 06:29:35.679 186333 DEBUG nova.compute.manager [req-9137bca8-a7e0-4fcc-8c8b-26df75541647 req-c29fe6d8-aaac-4cb1-a36a-de61acec06fa fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 152d427b-2c4a-41ab-9e0a-8becaa1a46bc] Received event network-vif-unplugged-f3848980-840c-4db4-8a83-263a6a7ee341 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:29:35 compute-0 nova_compute[186329]: 2025-12-05 06:29:35.682 186333 DEBUG oslo_concurrency.lockutils [req-9137bca8-a7e0-4fcc-8c8b-26df75541647 req-c29fe6d8-aaac-4cb1-a36a-de61acec06fa fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "152d427b-2c4a-41ab-9e0a-8becaa1a46bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:29:35 compute-0 nova_compute[186329]: 2025-12-05 06:29:35.682 186333 DEBUG oslo_concurrency.lockutils [req-9137bca8-a7e0-4fcc-8c8b-26df75541647 req-c29fe6d8-aaac-4cb1-a36a-de61acec06fa fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "152d427b-2c4a-41ab-9e0a-8becaa1a46bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:29:35 compute-0 nova_compute[186329]: 2025-12-05 06:29:35.682 186333 DEBUG oslo_concurrency.lockutils [req-9137bca8-a7e0-4fcc-8c8b-26df75541647 req-c29fe6d8-aaac-4cb1-a36a-de61acec06fa fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "152d427b-2c4a-41ab-9e0a-8becaa1a46bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:29:35 compute-0 nova_compute[186329]: 2025-12-05 06:29:35.682 186333 DEBUG nova.compute.manager [req-9137bca8-a7e0-4fcc-8c8b-26df75541647 req-c29fe6d8-aaac-4cb1-a36a-de61acec06fa fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 152d427b-2c4a-41ab-9e0a-8becaa1a46bc] No waiting events found dispatching network-vif-unplugged-f3848980-840c-4db4-8a83-263a6a7ee341 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:29:35 compute-0 nova_compute[186329]: 2025-12-05 06:29:35.682 186333 DEBUG nova.compute.manager [req-9137bca8-a7e0-4fcc-8c8b-26df75541647 req-c29fe6d8-aaac-4cb1-a36a-de61acec06fa fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 152d427b-2c4a-41ab-9e0a-8becaa1a46bc] Received event network-vif-unplugged-f3848980-840c-4db4-8a83-263a6a7ee341 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 05 06:29:35 compute-0 neutron-haproxy-ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508[213606]: [NOTICE]   (213610) : haproxy version is 3.0.5-8e879a5
Dec 05 06:29:35 compute-0 neutron-haproxy-ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508[213606]: [NOTICE]   (213610) : path to executable is /usr/sbin/haproxy
Dec 05 06:29:35 compute-0 neutron-haproxy-ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508[213606]: [WARNING]  (213610) : Exiting Master process...
Dec 05 06:29:35 compute-0 neutron-haproxy-ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508[213606]: [ALERT]    (213610) : Current worker (213612) exited with code 143 (Terminated)
Dec 05 06:29:35 compute-0 neutron-haproxy-ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508[213606]: [WARNING]  (213610) : All workers exited. Exiting... (0)
Dec 05 06:29:35 compute-0 podman[213746]: 2025-12-05 06:29:35.698091895 +0000 UTC m=+0.022397012 container kill 3fafa46ac2f5667106886fff9b7fa532f7ea7e33bf4edd3cac67260ae8516af3 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, maintainer=OpenStack Kubernetes Operator team)
Dec 05 06:29:35 compute-0 systemd[1]: libpod-3fafa46ac2f5667106886fff9b7fa532f7ea7e33bf4edd3cac67260ae8516af3.scope: Deactivated successfully.
Dec 05 06:29:35 compute-0 podman[213758]: 2025-12-05 06:29:35.729154409 +0000 UTC m=+0.017568845 container died 3fafa46ac2f5667106886fff9b7fa532f7ea7e33bf4edd3cac67260ae8516af3 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4)
Dec 05 06:29:35 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3fafa46ac2f5667106886fff9b7fa532f7ea7e33bf4edd3cac67260ae8516af3-userdata-shm.mount: Deactivated successfully.
Dec 05 06:29:35 compute-0 systemd[1]: var-lib-containers-storage-overlay-f5353e8db664a8bd790a3fa92ebb600be7e909bab5934d0ec2094aeb4c0f3278-merged.mount: Deactivated successfully.
Dec 05 06:29:35 compute-0 podman[213758]: 2025-12-05 06:29:35.75249312 +0000 UTC m=+0.040907546 container cleanup 3fafa46ac2f5667106886fff9b7fa532f7ea7e33bf4edd3cac67260ae8516af3 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Dec 05 06:29:35 compute-0 systemd[1]: libpod-conmon-3fafa46ac2f5667106886fff9b7fa532f7ea7e33bf4edd3cac67260ae8516af3.scope: Deactivated successfully.
Dec 05 06:29:35 compute-0 podman[213760]: 2025-12-05 06:29:35.762278539 +0000 UTC m=+0.045519888 container remove 3fafa46ac2f5667106886fff9b7fa532f7ea7e33bf4edd3cac67260ae8516af3 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec)
Dec 05 06:29:35 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:35.767 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[1fddf2ea-7152-454a-8991-399ee938d979]: (4, ("Fri Dec  5 06:29:35 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508 (3fafa46ac2f5667106886fff9b7fa532f7ea7e33bf4edd3cac67260ae8516af3)\n3fafa46ac2f5667106886fff9b7fa532f7ea7e33bf4edd3cac67260ae8516af3\nFri Dec  5 06:29:35 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508 (3fafa46ac2f5667106886fff9b7fa532f7ea7e33bf4edd3cac67260ae8516af3)\n3fafa46ac2f5667106886fff9b7fa532f7ea7e33bf4edd3cac67260ae8516af3\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:29:35 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:35.770 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[1fa5ec10-d0bc-40c3-8011-8f0ee914f019]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:29:35 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:35.770 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a0d03e43-34ea-4028-a2db-d5149175a508.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a0d03e43-34ea-4028-a2db-d5149175a508.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:29:35 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:35.771 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[1f9af084-6e4b-477c-888e-e857fb92b0ca]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:29:35 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:35.771 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa0d03e43-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:29:35 compute-0 nova_compute[186329]: 2025-12-05 06:29:35.773 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:35 compute-0 kernel: tapa0d03e43-30: left promiscuous mode
Dec 05 06:29:35 compute-0 nova_compute[186329]: 2025-12-05 06:29:35.785 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:35 compute-0 nova_compute[186329]: 2025-12-05 06:29:35.788 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:35 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:35.790 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[00344a6a-911e-4ac7-8426-c8b3563aed9e]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:29:35 compute-0 nova_compute[186329]: 2025-12-05 06:29:35.796 186333 INFO nova.virt.libvirt.driver [-] [instance: 152d427b-2c4a-41ab-9e0a-8becaa1a46bc] Instance destroyed successfully.
Dec 05 06:29:35 compute-0 nova_compute[186329]: 2025-12-05 06:29:35.796 186333 DEBUG nova.objects.instance [None req-a7bfc4ec-4971-4314-bdbb-1612d0f5d08e f26b0764633e44508f3eff072931d01d d7bd52c6f43e4c0fbd1009b9f0994d4c - - default default] Lazy-loading 'resources' on Instance uuid 152d427b-2c4a-41ab-9e0a-8becaa1a46bc obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:29:35 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:35.800 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[9c95f5cc-dc09-4b1b-910f-c49d2e6735b5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:29:35 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:35.800 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[5fbc3379-c92f-4898-9d67-3a7cdb03a9e3]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:29:35 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:35.811 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[4ec5452c-2e49-40dd-b513-d9f65678033a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 402709, 'reachable_time': 28875, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213801, 'error': None, 'target': 'ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:29:35 compute-0 systemd[1]: run-netns-ovnmeta\x2da0d03e43\x2d34ea\x2d4028\x2da2db\x2dd5149175a508.mount: Deactivated successfully.
Dec 05 06:29:35 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:35.813 104153 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a0d03e43-34ea-4028-a2db-d5149175a508 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 05 06:29:35 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:35.814 104153 DEBUG oslo.privsep.daemon [-] privsep: reply[9058dfc1-3f5c-45d0-9a3f-5fd784eb34d8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:29:36 compute-0 nova_compute[186329]: 2025-12-05 06:29:36.301 186333 DEBUG nova.virt.libvirt.vif [None req-a7bfc4ec-4971-4314-bdbb-1612d0f5d08e f26b0764633e44508f3eff072931d01d d7bd52c6f43e4c0fbd1009b9f0994d4c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-12-05T06:28:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-1057592448',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-1057592448',id=20,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T06:28:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1152,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d7bd52c6f43e4c0fbd1009b9f0994d4c',ramdisk_id='',reservation_id='r-zzhh09hb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',clean_attempts='1',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_v
if_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1379393409',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-1379393409-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T06:29:18Z,user_data=None,user_id='f26b0764633e44508f3eff072931d01d',uuid=152d427b-2c4a-41ab-9e0a-8becaa1a46bc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f3848980-840c-4db4-8a83-263a6a7ee341", "address": "fa:16:3e:66:b2:8b", "network": {"id": "a0d03e43-34ea-4028-a2db-d5149175a508", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1338527276-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a604f2bba734821844adb92d1c578a7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3848980-84", "ovs_interfaceid": "f3848980-840c-4db4-8a83-263a6a7ee341", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 05 06:29:36 compute-0 nova_compute[186329]: 2025-12-05 06:29:36.301 186333 DEBUG nova.network.os_vif_util [None req-a7bfc4ec-4971-4314-bdbb-1612d0f5d08e f26b0764633e44508f3eff072931d01d d7bd52c6f43e4c0fbd1009b9f0994d4c - - default default] Converting VIF {"id": "f3848980-840c-4db4-8a83-263a6a7ee341", "address": "fa:16:3e:66:b2:8b", "network": {"id": "a0d03e43-34ea-4028-a2db-d5149175a508", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-1338527276-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2a604f2bba734821844adb92d1c578a7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3848980-84", "ovs_interfaceid": "f3848980-840c-4db4-8a83-263a6a7ee341", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:29:36 compute-0 nova_compute[186329]: 2025-12-05 06:29:36.302 186333 DEBUG nova.network.os_vif_util [None req-a7bfc4ec-4971-4314-bdbb-1612d0f5d08e f26b0764633e44508f3eff072931d01d d7bd52c6f43e4c0fbd1009b9f0994d4c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:66:b2:8b,bridge_name='br-int',has_traffic_filtering=True,id=f3848980-840c-4db4-8a83-263a6a7ee341,network=Network(a0d03e43-34ea-4028-a2db-d5149175a508),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3848980-84') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:29:36 compute-0 nova_compute[186329]: 2025-12-05 06:29:36.302 186333 DEBUG os_vif [None req-a7bfc4ec-4971-4314-bdbb-1612d0f5d08e f26b0764633e44508f3eff072931d01d d7bd52c6f43e4c0fbd1009b9f0994d4c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:66:b2:8b,bridge_name='br-int',has_traffic_filtering=True,id=f3848980-840c-4db4-8a83-263a6a7ee341,network=Network(a0d03e43-34ea-4028-a2db-d5149175a508),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3848980-84') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 05 06:29:36 compute-0 nova_compute[186329]: 2025-12-05 06:29:36.303 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:36 compute-0 nova_compute[186329]: 2025-12-05 06:29:36.303 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf3848980-84, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:29:36 compute-0 nova_compute[186329]: 2025-12-05 06:29:36.304 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:36 compute-0 nova_compute[186329]: 2025-12-05 06:29:36.306 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:29:36 compute-0 nova_compute[186329]: 2025-12-05 06:29:36.307 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:36 compute-0 nova_compute[186329]: 2025-12-05 06:29:36.307 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=f3620838-c1c5-4d76-95c5-50f7135ebedd) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:29:36 compute-0 nova_compute[186329]: 2025-12-05 06:29:36.307 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:36 compute-0 nova_compute[186329]: 2025-12-05 06:29:36.308 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:36 compute-0 nova_compute[186329]: 2025-12-05 06:29:36.309 186333 INFO os_vif [None req-a7bfc4ec-4971-4314-bdbb-1612d0f5d08e f26b0764633e44508f3eff072931d01d d7bd52c6f43e4c0fbd1009b9f0994d4c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:66:b2:8b,bridge_name='br-int',has_traffic_filtering=True,id=f3848980-840c-4db4-8a83-263a6a7ee341,network=Network(a0d03e43-34ea-4028-a2db-d5149175a508),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3848980-84')
Dec 05 06:29:36 compute-0 nova_compute[186329]: 2025-12-05 06:29:36.310 186333 INFO nova.virt.libvirt.driver [None req-a7bfc4ec-4971-4314-bdbb-1612d0f5d08e f26b0764633e44508f3eff072931d01d d7bd52c6f43e4c0fbd1009b9f0994d4c - - default default] [instance: 152d427b-2c4a-41ab-9e0a-8becaa1a46bc] Deleting instance files /var/lib/nova/instances/152d427b-2c4a-41ab-9e0a-8becaa1a46bc_del
Dec 05 06:29:36 compute-0 nova_compute[186329]: 2025-12-05 06:29:36.310 186333 INFO nova.virt.libvirt.driver [None req-a7bfc4ec-4971-4314-bdbb-1612d0f5d08e f26b0764633e44508f3eff072931d01d d7bd52c6f43e4c0fbd1009b9f0994d4c - - default default] [instance: 152d427b-2c4a-41ab-9e0a-8becaa1a46bc] Deletion of /var/lib/nova/instances/152d427b-2c4a-41ab-9e0a-8becaa1a46bc_del complete
Dec 05 06:29:36 compute-0 nova_compute[186329]: 2025-12-05 06:29:36.817 186333 INFO nova.compute.manager [None req-a7bfc4ec-4971-4314-bdbb-1612d0f5d08e f26b0764633e44508f3eff072931d01d d7bd52c6f43e4c0fbd1009b9f0994d4c - - default default] [instance: 152d427b-2c4a-41ab-9e0a-8becaa1a46bc] Took 1.27 seconds to destroy the instance on the hypervisor.
Dec 05 06:29:36 compute-0 nova_compute[186329]: 2025-12-05 06:29:36.818 186333 DEBUG oslo.service.backend._eventlet.loopingcall [None req-a7bfc4ec-4971-4314-bdbb-1612d0f5d08e f26b0764633e44508f3eff072931d01d d7bd52c6f43e4c0fbd1009b9f0994d4c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 05 06:29:36 compute-0 nova_compute[186329]: 2025-12-05 06:29:36.818 186333 DEBUG nova.compute.manager [-] [instance: 152d427b-2c4a-41ab-9e0a-8becaa1a46bc] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 05 06:29:36 compute-0 nova_compute[186329]: 2025-12-05 06:29:36.818 186333 DEBUG nova.network.neutron [-] [instance: 152d427b-2c4a-41ab-9e0a-8becaa1a46bc] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 05 06:29:36 compute-0 nova_compute[186329]: 2025-12-05 06:29:36.819 186333 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:29:36 compute-0 nova_compute[186329]: 2025-12-05 06:29:36.882 186333 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:29:37 compute-0 nova_compute[186329]: 2025-12-05 06:29:37.584 186333 DEBUG nova.network.neutron [-] [instance: 152d427b-2c4a-41ab-9e0a-8becaa1a46bc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:29:37 compute-0 nova_compute[186329]: 2025-12-05 06:29:37.721 186333 DEBUG nova.compute.manager [req-c13b93fe-c724-400c-8bdf-ef045667617d req-7b12c39e-64ac-46d2-a2eb-f9a146bce2d3 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 152d427b-2c4a-41ab-9e0a-8becaa1a46bc] Received event network-vif-unplugged-f3848980-840c-4db4-8a83-263a6a7ee341 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:29:37 compute-0 nova_compute[186329]: 2025-12-05 06:29:37.721 186333 DEBUG oslo_concurrency.lockutils [req-c13b93fe-c724-400c-8bdf-ef045667617d req-7b12c39e-64ac-46d2-a2eb-f9a146bce2d3 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "152d427b-2c4a-41ab-9e0a-8becaa1a46bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:29:37 compute-0 nova_compute[186329]: 2025-12-05 06:29:37.722 186333 DEBUG oslo_concurrency.lockutils [req-c13b93fe-c724-400c-8bdf-ef045667617d req-7b12c39e-64ac-46d2-a2eb-f9a146bce2d3 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "152d427b-2c4a-41ab-9e0a-8becaa1a46bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:29:37 compute-0 nova_compute[186329]: 2025-12-05 06:29:37.722 186333 DEBUG oslo_concurrency.lockutils [req-c13b93fe-c724-400c-8bdf-ef045667617d req-7b12c39e-64ac-46d2-a2eb-f9a146bce2d3 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "152d427b-2c4a-41ab-9e0a-8becaa1a46bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:29:37 compute-0 nova_compute[186329]: 2025-12-05 06:29:37.722 186333 DEBUG nova.compute.manager [req-c13b93fe-c724-400c-8bdf-ef045667617d req-7b12c39e-64ac-46d2-a2eb-f9a146bce2d3 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 152d427b-2c4a-41ab-9e0a-8becaa1a46bc] No waiting events found dispatching network-vif-unplugged-f3848980-840c-4db4-8a83-263a6a7ee341 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:29:37 compute-0 nova_compute[186329]: 2025-12-05 06:29:37.722 186333 DEBUG nova.compute.manager [req-c13b93fe-c724-400c-8bdf-ef045667617d req-7b12c39e-64ac-46d2-a2eb-f9a146bce2d3 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 152d427b-2c4a-41ab-9e0a-8becaa1a46bc] Received event network-vif-unplugged-f3848980-840c-4db4-8a83-263a6a7ee341 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 05 06:29:37 compute-0 nova_compute[186329]: 2025-12-05 06:29:37.722 186333 DEBUG nova.compute.manager [req-c13b93fe-c724-400c-8bdf-ef045667617d req-7b12c39e-64ac-46d2-a2eb-f9a146bce2d3 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 152d427b-2c4a-41ab-9e0a-8becaa1a46bc] Received event network-vif-deleted-f3848980-840c-4db4-8a83-263a6a7ee341 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:29:38 compute-0 nova_compute[186329]: 2025-12-05 06:29:38.088 186333 INFO nova.compute.manager [-] [instance: 152d427b-2c4a-41ab-9e0a-8becaa1a46bc] Took 1.27 seconds to deallocate network for instance.
Dec 05 06:29:38 compute-0 nova_compute[186329]: 2025-12-05 06:29:38.600 186333 DEBUG oslo_concurrency.lockutils [None req-a7bfc4ec-4971-4314-bdbb-1612d0f5d08e f26b0764633e44508f3eff072931d01d d7bd52c6f43e4c0fbd1009b9f0994d4c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:29:38 compute-0 nova_compute[186329]: 2025-12-05 06:29:38.600 186333 DEBUG oslo_concurrency.lockutils [None req-a7bfc4ec-4971-4314-bdbb-1612d0f5d08e f26b0764633e44508f3eff072931d01d d7bd52c6f43e4c0fbd1009b9f0994d4c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:29:38 compute-0 nova_compute[186329]: 2025-12-05 06:29:38.605 186333 DEBUG oslo_concurrency.lockutils [None req-a7bfc4ec-4971-4314-bdbb-1612d0f5d08e f26b0764633e44508f3eff072931d01d d7bd52c6f43e4c0fbd1009b9f0994d4c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:29:38 compute-0 nova_compute[186329]: 2025-12-05 06:29:38.626 186333 INFO nova.scheduler.client.report [None req-a7bfc4ec-4971-4314-bdbb-1612d0f5d08e f26b0764633e44508f3eff072931d01d d7bd52c6f43e4c0fbd1009b9f0994d4c - - default default] Deleted allocations for instance 152d427b-2c4a-41ab-9e0a-8becaa1a46bc
Dec 05 06:29:38 compute-0 nova_compute[186329]: 2025-12-05 06:29:38.932 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:39 compute-0 nova_compute[186329]: 2025-12-05 06:29:39.643 186333 DEBUG oslo_concurrency.lockutils [None req-a7bfc4ec-4971-4314-bdbb-1612d0f5d08e f26b0764633e44508f3eff072931d01d d7bd52c6f43e4c0fbd1009b9f0994d4c - - default default] Lock "152d427b-2c4a-41ab-9e0a-8becaa1a46bc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.609s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:29:41 compute-0 nova_compute[186329]: 2025-12-05 06:29:41.308 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:43 compute-0 nova_compute[186329]: 2025-12-05 06:29:43.933 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:45 compute-0 nova_compute[186329]: 2025-12-05 06:29:45.188 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:46 compute-0 nova_compute[186329]: 2025-12-05 06:29:46.309 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:46 compute-0 podman[213802]: 2025-12-05 06:29:46.462928714 +0000 UTC m=+0.047501693 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec)
Dec 05 06:29:46 compute-0 podman[213803]: 2025-12-05 06:29:46.49300283 +0000 UTC m=+0.076030625 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=ubi9-minimal, release=1755695350, distribution-scope=public, version=9.6, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.buildah.version=1.33.7, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.openshift.tags=minimal rhel9)
Dec 05 06:29:46 compute-0 podman[213804]: 2025-12-05 06:29:46.493155598 +0000 UTC m=+0.073955914 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, config_id=multipathd)
Dec 05 06:29:48 compute-0 nova_compute[186329]: 2025-12-05 06:29:48.934 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:51 compute-0 nova_compute[186329]: 2025-12-05 06:29:51.311 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:51 compute-0 nova_compute[186329]: 2025-12-05 06:29:51.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:29:53 compute-0 nova_compute[186329]: 2025-12-05 06:29:53.935 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:54 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:54.845 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:73:d2:62 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '32d36f4d4d354e18b9270b2f2f540379', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fdc9478a-32e1-4f82-931f-bf423e0028ed, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=d9989e5b-035f-46c9-8e0e-5fd96cf594af) old=Port_Binding(mac=['fa:16:3e:73:d2:62'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '32d36f4d4d354e18b9270b2f2f540379', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:29:54 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:54.846 104041 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port d9989e5b-035f-46c9-8e0e-5fd96cf594af in datapath b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4 updated
Dec 05 06:29:54 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:54.847 104041 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 05 06:29:54 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:29:54.848 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[cfb613fd-4041-4dda-a7ec-140c5f753ae8]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:29:56 compute-0 nova_compute[186329]: 2025-12-05 06:29:56.312 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:58 compute-0 nova_compute[186329]: 2025-12-05 06:29:58.215 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:29:58 compute-0 nova_compute[186329]: 2025-12-05 06:29:58.705 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:29:58 compute-0 nova_compute[186329]: 2025-12-05 06:29:58.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:29:58 compute-0 nova_compute[186329]: 2025-12-05 06:29:58.709 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 05 06:29:58 compute-0 nova_compute[186329]: 2025-12-05 06:29:58.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:29:58 compute-0 nova_compute[186329]: 2025-12-05 06:29:58.936 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:29:59 compute-0 nova_compute[186329]: 2025-12-05 06:29:59.223 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:29:59 compute-0 nova_compute[186329]: 2025-12-05 06:29:59.223 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:29:59 compute-0 nova_compute[186329]: 2025-12-05 06:29:59.224 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:29:59 compute-0 nova_compute[186329]: 2025-12-05 06:29:59.224 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 05 06:29:59 compute-0 podman[196599]: time="2025-12-05T06:29:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:29:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:29:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:29:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:29:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2589 "" "Go-http-client/1.1"
Dec 05 06:30:00 compute-0 nova_compute[186329]: 2025-12-05 06:30:00.257 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbcb12e6-ebf7-49e2-847a-65f1b3a3266c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:30:00 compute-0 nova_compute[186329]: 2025-12-05 06:30:00.299 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbcb12e6-ebf7-49e2-847a-65f1b3a3266c/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:30:00 compute-0 nova_compute[186329]: 2025-12-05 06:30:00.300 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbcb12e6-ebf7-49e2-847a-65f1b3a3266c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:30:00 compute-0 nova_compute[186329]: 2025-12-05 06:30:00.341 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbcb12e6-ebf7-49e2-847a-65f1b3a3266c/disk --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:30:00 compute-0 nova_compute[186329]: 2025-12-05 06:30:00.344 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:30:00 compute-0 nova_compute[186329]: 2025-12-05 06:30:00.384 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:30:00 compute-0 nova_compute[186329]: 2025-12-05 06:30:00.385 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:30:00 compute-0 nova_compute[186329]: 2025-12-05 06:30:00.425 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:30:00 compute-0 nova_compute[186329]: 2025-12-05 06:30:00.604 186333 WARNING nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:30:00 compute-0 nova_compute[186329]: 2025-12-05 06:30:00.605 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:30:00 compute-0 nova_compute[186329]: 2025-12-05 06:30:00.619 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.014s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:30:00 compute-0 nova_compute[186329]: 2025-12-05 06:30:00.619 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5525MB free_disk=73.10901641845703GB free_vcpus=2 pci_devices=[{"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": 
"0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 05 06:30:00 compute-0 nova_compute[186329]: 2025-12-05 06:30:00.620 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:30:00 compute-0 nova_compute[186329]: 2025-12-05 06:30:00.620 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:30:00 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:30:00.700 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d7:c2:b3 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-4af2a19e-0ded-4e11-b108-0ee5d77bca3b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4af2a19e-0ded-4e11-b108-0ee5d77bca3b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3c9c02fc0c8641c48e6ccfd619fde68d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1f300e0e-0831-4b3d-bd94-4ddbe3286eeb, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=4d369cf2-cbf2-44e5-b359-c3827c4cb2a8) old=Port_Binding(mac=['fa:16:3e:d7:c2:b3'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-4af2a19e-0ded-4e11-b108-0ee5d77bca3b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4af2a19e-0ded-4e11-b108-0ee5d77bca3b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3c9c02fc0c8641c48e6ccfd619fde68d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:30:00 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:30:00.701 104041 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 4d369cf2-cbf2-44e5-b359-c3827c4cb2a8 in datapath 4af2a19e-0ded-4e11-b108-0ee5d77bca3b updated
Dec 05 06:30:00 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:30:00.702 104041 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4af2a19e-0ded-4e11-b108-0ee5d77bca3b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 05 06:30:00 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:30:00.703 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[dc04fa77-0eb6-40e2-af02-2bbd82834662]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:30:01 compute-0 nova_compute[186329]: 2025-12-05 06:30:01.315 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:30:01 compute-0 openstack_network_exporter[198686]: ERROR   06:30:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:30:01 compute-0 openstack_network_exporter[198686]: ERROR   06:30:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:30:01 compute-0 openstack_network_exporter[198686]: ERROR   06:30:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:30:01 compute-0 openstack_network_exporter[198686]: ERROR   06:30:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:30:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:30:01 compute-0 openstack_network_exporter[198686]: ERROR   06:30:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:30:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:30:01 compute-0 anacron[124195]: Job `cron.weekly' started
Dec 05 06:30:01 compute-0 anacron[124195]: Job `cron.weekly' terminated
Dec 05 06:30:02 compute-0 nova_compute[186329]: 2025-12-05 06:30:02.171 186333 INFO nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance 83dcc9e9-9fcf-4e34-8b76-598c38ed9bc0 has allocations against this compute host but is not found in the database.
Dec 05 06:30:02 compute-0 nova_compute[186329]: 2025-12-05 06:30:02.171 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 05 06:30:02 compute-0 nova_compute[186329]: 2025-12-05 06:30:02.172 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 06:30:00 up  1:07,  0 user,  load average: 0.01, 0.11, 0.21\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 05 06:30:02 compute-0 nova_compute[186329]: 2025-12-05 06:30:02.201 186333 DEBUG nova.compute.provider_tree [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:30:02 compute-0 nova_compute[186329]: 2025-12-05 06:30:02.706 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:30:03 compute-0 nova_compute[186329]: 2025-12-05 06:30:03.211 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 05 06:30:03 compute-0 nova_compute[186329]: 2025-12-05 06:30:03.211 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.591s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:30:03 compute-0 nova_compute[186329]: 2025-12-05 06:30:03.939 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:30:06 compute-0 nova_compute[186329]: 2025-12-05 06:30:06.211 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:30:06 compute-0 nova_compute[186329]: 2025-12-05 06:30:06.212 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:30:06 compute-0 nova_compute[186329]: 2025-12-05 06:30:06.212 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:30:06 compute-0 nova_compute[186329]: 2025-12-05 06:30:06.212 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:30:06 compute-0 nova_compute[186329]: 2025-12-05 06:30:06.317 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:30:06 compute-0 podman[213872]: 2025-12-05 06:30:06.477893318 +0000 UTC m=+0.063649365 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251202)
Dec 05 06:30:06 compute-0 podman[213873]: 2025-12-05 06:30:06.477982546 +0000 UTC m=+0.062425694 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 06:30:07 compute-0 nova_compute[186329]: 2025-12-05 06:30:07.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:30:07 compute-0 nova_compute[186329]: 2025-12-05 06:30:07.710 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Dec 05 06:30:08 compute-0 nova_compute[186329]: 2025-12-05 06:30:08.214 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Dec 05 06:30:08 compute-0 nova_compute[186329]: 2025-12-05 06:30:08.941 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:30:10 compute-0 nova_compute[186329]: 2025-12-05 06:30:10.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:30:10 compute-0 nova_compute[186329]: 2025-12-05 06:30:10.710 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Dec 05 06:30:11 compute-0 nova_compute[186329]: 2025-12-05 06:30:11.319 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:30:13 compute-0 nova_compute[186329]: 2025-12-05 06:30:13.943 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:30:14 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:30:14.832 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '46:84:d1', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'a6:99:88:db:d9:a2'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:30:14 compute-0 nova_compute[186329]: 2025-12-05 06:30:14.833 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:30:14 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:30:14.833 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 05 06:30:16 compute-0 nova_compute[186329]: 2025-12-05 06:30:16.319 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:30:17 compute-0 podman[213919]: 2025-12-05 06:30:17.467474273 +0000 UTC m=+0.044192532 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.4)
Dec 05 06:30:17 compute-0 podman[213918]: 2025-12-05 06:30:17.472404182 +0000 UTC m=+0.050229502 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, vcs-type=git)
Dec 05 06:30:17 compute-0 podman[213917]: 2025-12-05 06:30:17.472464356 +0000 UTC m=+0.051911797 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 05 06:30:18 compute-0 nova_compute[186329]: 2025-12-05 06:30:18.944 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:30:18 compute-0 ovn_controller[95223]: 2025-12-05T06:30:18Z|00185|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Dec 05 06:30:21 compute-0 nova_compute[186329]: 2025-12-05 06:30:21.321 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:30:21 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:30:21.834 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=89d40815-76f5-4f1d-9077-84d831b7d6c4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:30:23 compute-0 nova_compute[186329]: 2025-12-05 06:30:23.946 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:30:26 compute-0 nova_compute[186329]: 2025-12-05 06:30:26.323 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:30:28 compute-0 nova_compute[186329]: 2025-12-05 06:30:28.948 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:30:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:30:29.517 104041 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:30:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:30:29.517 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:30:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:30:29.517 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:30:29 compute-0 podman[196599]: time="2025-12-05T06:30:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:30:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:30:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:30:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:30:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2590 "" "Go-http-client/1.1"
Dec 05 06:30:31 compute-0 nova_compute[186329]: 2025-12-05 06:30:31.325 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:30:31 compute-0 openstack_network_exporter[198686]: ERROR   06:30:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:30:31 compute-0 openstack_network_exporter[198686]: ERROR   06:30:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:30:31 compute-0 openstack_network_exporter[198686]: ERROR   06:30:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:30:31 compute-0 openstack_network_exporter[198686]: ERROR   06:30:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:30:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:30:31 compute-0 openstack_network_exporter[198686]: ERROR   06:30:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:30:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:30:33 compute-0 nova_compute[186329]: 2025-12-05 06:30:33.949 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:30:36 compute-0 nova_compute[186329]: 2025-12-05 06:30:36.327 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:30:37 compute-0 podman[213969]: 2025-12-05 06:30:37.457456033 +0000 UTC m=+0.039792119 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 05 06:30:37 compute-0 podman[213968]: 2025-12-05 06:30:37.472759236 +0000 UTC m=+0.057834201 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 05 06:30:38 compute-0 nova_compute[186329]: 2025-12-05 06:30:38.950 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:30:41 compute-0 nova_compute[186329]: 2025-12-05 06:30:41.329 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:30:43 compute-0 nova_compute[186329]: 2025-12-05 06:30:43.951 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:30:46 compute-0 nova_compute[186329]: 2025-12-05 06:30:46.331 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:30:48 compute-0 podman[214015]: 2025-12-05 06:30:48.471406089 +0000 UTC m=+0.042152846 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.build-date=20251202, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Dec 05 06:30:48 compute-0 podman[214013]: 2025-12-05 06:30:48.487406574 +0000 UTC m=+0.066499364 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, 
io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 05 06:30:48 compute-0 podman[214014]: 2025-12-05 06:30:48.50338722 +0000 UTC m=+0.075212687 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': 
'/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 05 06:30:48 compute-0 nova_compute[186329]: 2025-12-05 06:30:48.954 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:30:51 compute-0 nova_compute[186329]: 2025-12-05 06:30:51.333 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:30:53 compute-0 nova_compute[186329]: 2025-12-05 06:30:53.955 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:30:56 compute-0 nova_compute[186329]: 2025-12-05 06:30:56.335 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:30:58 compute-0 nova_compute[186329]: 2025-12-05 06:30:58.957 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:30:59 compute-0 nova_compute[186329]: 2025-12-05 06:30:59.214 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:30:59 compute-0 nova_compute[186329]: 2025-12-05 06:30:59.727 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:30:59 compute-0 nova_compute[186329]: 2025-12-05 06:30:59.727 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:30:59 compute-0 nova_compute[186329]: 2025-12-05 06:30:59.727 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:30:59 compute-0 nova_compute[186329]: 2025-12-05 06:30:59.728 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 05 06:30:59 compute-0 podman[196599]: time="2025-12-05T06:30:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:30:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:30:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:30:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:30:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2587 "" "Go-http-client/1.1"
Dec 05 06:31:00 compute-0 nova_compute[186329]: 2025-12-05 06:31:00.752 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbcb12e6-ebf7-49e2-847a-65f1b3a3266c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:31:00 compute-0 nova_compute[186329]: 2025-12-05 06:31:00.792 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbcb12e6-ebf7-49e2-847a-65f1b3a3266c/disk --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:31:00 compute-0 nova_compute[186329]: 2025-12-05 06:31:00.793 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbcb12e6-ebf7-49e2-847a-65f1b3a3266c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:31:00 compute-0 nova_compute[186329]: 2025-12-05 06:31:00.832 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbcb12e6-ebf7-49e2-847a-65f1b3a3266c/disk --force-share --output=json" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:31:00 compute-0 nova_compute[186329]: 2025-12-05 06:31:00.835 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:31:00 compute-0 nova_compute[186329]: 2025-12-05 06:31:00.878 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:31:00 compute-0 nova_compute[186329]: 2025-12-05 06:31:00.878 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:31:00 compute-0 nova_compute[186329]: 2025-12-05 06:31:00.918 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk --force-share --output=json" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:31:01 compute-0 nova_compute[186329]: 2025-12-05 06:31:01.102 186333 WARNING nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:31:01 compute-0 nova_compute[186329]: 2025-12-05 06:31:01.103 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:31:01 compute-0 nova_compute[186329]: 2025-12-05 06:31:01.116 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.014s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:31:01 compute-0 nova_compute[186329]: 2025-12-05 06:31:01.117 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5515MB free_disk=73.10911560058594GB free_vcpus=2 pci_devices=[{"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": 
"0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 05 06:31:01 compute-0 nova_compute[186329]: 2025-12-05 06:31:01.117 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:31:01 compute-0 nova_compute[186329]: 2025-12-05 06:31:01.117 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:31:01 compute-0 nova_compute[186329]: 2025-12-05 06:31:01.337 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:01 compute-0 openstack_network_exporter[198686]: ERROR   06:31:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:31:01 compute-0 openstack_network_exporter[198686]: ERROR   06:31:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:31:01 compute-0 openstack_network_exporter[198686]: ERROR   06:31:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:31:01 compute-0 openstack_network_exporter[198686]: ERROR   06:31:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:31:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:31:01 compute-0 openstack_network_exporter[198686]: ERROR   06:31:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:31:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:31:02 compute-0 nova_compute[186329]: 2025-12-05 06:31:02.704 186333 INFO nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance 83dcc9e9-9fcf-4e34-8b76-598c38ed9bc0 has allocations against this compute host but is not found in the database.
Dec 05 06:31:02 compute-0 nova_compute[186329]: 2025-12-05 06:31:02.705 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 05 06:31:02 compute-0 nova_compute[186329]: 2025-12-05 06:31:02.705 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 06:31:01 up  1:08,  0 user,  load average: 0.04, 0.10, 0.19\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 05 06:31:02 compute-0 nova_compute[186329]: 2025-12-05 06:31:02.801 186333 DEBUG nova.compute.provider_tree [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:31:03 compute-0 nova_compute[186329]: 2025-12-05 06:31:03.304 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:31:03 compute-0 nova_compute[186329]: 2025-12-05 06:31:03.808 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 05 06:31:03 compute-0 nova_compute[186329]: 2025-12-05 06:31:03.808 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.691s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:31:03 compute-0 nova_compute[186329]: 2025-12-05 06:31:03.959 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:04 compute-0 nova_compute[186329]: 2025-12-05 06:31:04.299 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:31:04 compute-0 nova_compute[186329]: 2025-12-05 06:31:04.299 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:31:04 compute-0 nova_compute[186329]: 2025-12-05 06:31:04.758 186333 DEBUG nova.virt.libvirt.driver [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8378e152-c0bc-4b50-889d-5ed87ecd729c] Creating tmpfile /var/lib/nova/instances/tmpr9_xm_lb to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Dec 05 06:31:04 compute-0 nova_compute[186329]: 2025-12-05 06:31:04.759 186333 WARNING neutronclient.v2_0.client [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:31:04 compute-0 nova_compute[186329]: 2025-12-05 06:31:04.761 186333 DEBUG nova.compute.manager [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpr9_xm_lb',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Dec 05 06:31:04 compute-0 nova_compute[186329]: 2025-12-05 06:31:04.794 186333 DEBUG nova.virt.libvirt.driver [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dd3b9204-cbe1-458d-b7eb-44c824804e0c] Creating tmpfile /var/lib/nova/instances/tmpx84c5lud to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Dec 05 06:31:04 compute-0 nova_compute[186329]: 2025-12-05 06:31:04.794 186333 WARNING neutronclient.v2_0.client [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:31:04 compute-0 nova_compute[186329]: 2025-12-05 06:31:04.796 186333 DEBUG nova.compute.manager [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpx84c5lud',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Dec 05 06:31:04 compute-0 nova_compute[186329]: 2025-12-05 06:31:04.803 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:31:04 compute-0 nova_compute[186329]: 2025-12-05 06:31:04.803 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:31:04 compute-0 nova_compute[186329]: 2025-12-05 06:31:04.803 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:31:04 compute-0 nova_compute[186329]: 2025-12-05 06:31:04.803 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:31:04 compute-0 nova_compute[186329]: 2025-12-05 06:31:04.803 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:31:04 compute-0 nova_compute[186329]: 2025-12-05 06:31:04.804 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 05 06:31:05 compute-0 nova_compute[186329]: 2025-12-05 06:31:05.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:31:06 compute-0 nova_compute[186329]: 2025-12-05 06:31:06.339 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:06 compute-0 nova_compute[186329]: 2025-12-05 06:31:06.781 186333 WARNING neutronclient.v2_0.client [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:31:06 compute-0 nova_compute[186329]: 2025-12-05 06:31:06.812 186333 WARNING neutronclient.v2_0.client [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:31:08 compute-0 podman[214077]: 2025-12-05 06:31:08.455551504 +0000 UTC m=+0.039809061 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 06:31:08 compute-0 podman[214076]: 2025-12-05 06:31:08.479464495 +0000 UTC m=+0.066031564 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true)
Dec 05 06:31:08 compute-0 nova_compute[186329]: 2025-12-05 06:31:08.960 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:10 compute-0 nova_compute[186329]: 2025-12-05 06:31:10.673 186333 DEBUG nova.compute.manager [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpr9_xm_lb',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='8378e152-c0bc-4b50-889d-5ed87ecd729c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Dec 05 06:31:11 compute-0 nova_compute[186329]: 2025-12-05 06:31:11.340 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:11 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec 05 06:31:11 compute-0 nova_compute[186329]: 2025-12-05 06:31:11.683 186333 DEBUG oslo_concurrency.lockutils [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "refresh_cache-8378e152-c0bc-4b50-889d-5ed87ecd729c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:31:11 compute-0 nova_compute[186329]: 2025-12-05 06:31:11.683 186333 DEBUG oslo_concurrency.lockutils [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquired lock "refresh_cache-8378e152-c0bc-4b50-889d-5ed87ecd729c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:31:11 compute-0 nova_compute[186329]: 2025-12-05 06:31:11.683 186333 DEBUG nova.network.neutron [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8378e152-c0bc-4b50-889d-5ed87ecd729c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 05 06:31:12 compute-0 nova_compute[186329]: 2025-12-05 06:31:12.187 186333 WARNING neutronclient.v2_0.client [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:31:12 compute-0 nova_compute[186329]: 2025-12-05 06:31:12.892 186333 WARNING neutronclient.v2_0.client [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:31:13 compute-0 nova_compute[186329]: 2025-12-05 06:31:13.004 186333 DEBUG nova.network.neutron [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8378e152-c0bc-4b50-889d-5ed87ecd729c] Updating instance_info_cache with network_info: [{"id": "9eb9012b-dce5-48ed-afc1-5fccc0654e2e", "address": "fa:16:3e:5a:41:f7", "network": {"id": "b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-188557792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32d36f4d4d354e18b9270b2f2f540379", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9eb9012b-dc", "ovs_interfaceid": "9eb9012b-dce5-48ed-afc1-5fccc0654e2e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:31:13 compute-0 nova_compute[186329]: 2025-12-05 06:31:13.510 186333 DEBUG oslo_concurrency.lockutils [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Releasing lock "refresh_cache-8378e152-c0bc-4b50-889d-5ed87ecd729c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:31:13 compute-0 nova_compute[186329]: 2025-12-05 06:31:13.518 186333 DEBUG nova.virt.libvirt.driver [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8378e152-c0bc-4b50-889d-5ed87ecd729c] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpr9_xm_lb',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='8378e152-c0bc-4b50-889d-5ed87ecd729c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Dec 05 06:31:13 compute-0 nova_compute[186329]: 2025-12-05 06:31:13.518 186333 DEBUG nova.virt.libvirt.driver [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8378e152-c0bc-4b50-889d-5ed87ecd729c] Creating instance directory: /var/lib/nova/instances/8378e152-c0bc-4b50-889d-5ed87ecd729c pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Dec 05 06:31:13 compute-0 nova_compute[186329]: 2025-12-05 06:31:13.518 186333 DEBUG nova.virt.libvirt.driver [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8378e152-c0bc-4b50-889d-5ed87ecd729c] Creating disk.info with the contents: {'/var/lib/nova/instances/8378e152-c0bc-4b50-889d-5ed87ecd729c/disk': 'qcow2', '/var/lib/nova/instances/8378e152-c0bc-4b50-889d-5ed87ecd729c/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Dec 05 06:31:13 compute-0 nova_compute[186329]: 2025-12-05 06:31:13.519 186333 DEBUG nova.virt.libvirt.driver [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8378e152-c0bc-4b50-889d-5ed87ecd729c] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Dec 05 06:31:13 compute-0 nova_compute[186329]: 2025-12-05 06:31:13.519 186333 DEBUG nova.objects.instance [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 8378e152-c0bc-4b50-889d-5ed87ecd729c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:31:13 compute-0 nova_compute[186329]: 2025-12-05 06:31:13.961 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.023 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.025 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.026 186333 DEBUG oslo_concurrency.processutils [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.066 186333 DEBUG oslo_concurrency.processutils [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.067 186333 DEBUG oslo_concurrency.lockutils [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.067 186333 DEBUG oslo_concurrency.lockutils [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.067 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.070 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.070 186333 DEBUG oslo_concurrency.processutils [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.108 186333 DEBUG oslo_concurrency.processutils [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.038s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.109 186333 DEBUG oslo_concurrency.processutils [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b,backing_fmt=raw /var/lib/nova/instances/8378e152-c0bc-4b50-889d-5ed87ecd729c/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.126 186333 DEBUG oslo_concurrency.processutils [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b,backing_fmt=raw /var/lib/nova/instances/8378e152-c0bc-4b50-889d-5ed87ecd729c/disk 1073741824" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.126 186333 DEBUG oslo_concurrency.lockutils [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.059s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.127 186333 DEBUG oslo_concurrency.processutils [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.168 186333 DEBUG oslo_concurrency.processutils [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.169 186333 DEBUG nova.virt.disk.api [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Checking if we can resize image /var/lib/nova/instances/8378e152-c0bc-4b50-889d-5ed87ecd729c/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.169 186333 DEBUG oslo_concurrency.processutils [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8378e152-c0bc-4b50-889d-5ed87ecd729c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.210 186333 DEBUG oslo_concurrency.processutils [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8378e152-c0bc-4b50-889d-5ed87ecd729c/disk --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.211 186333 DEBUG nova.virt.disk.api [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Cannot resize image /var/lib/nova/instances/8378e152-c0bc-4b50-889d-5ed87ecd729c/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.211 186333 DEBUG nova.objects.instance [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lazy-loading 'migration_context' on Instance uuid 8378e152-c0bc-4b50-889d-5ed87ecd729c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.716 186333 DEBUG nova.objects.base [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Object Instance<8378e152-c0bc-4b50-889d-5ed87ecd729c> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.716 186333 DEBUG oslo_concurrency.processutils [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/8378e152-c0bc-4b50-889d-5ed87ecd729c/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.733 186333 DEBUG oslo_concurrency.processutils [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/8378e152-c0bc-4b50-889d-5ed87ecd729c/disk.config 497664" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.733 186333 DEBUG nova.virt.libvirt.driver [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8378e152-c0bc-4b50-889d-5ed87ecd729c] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.734 186333 DEBUG nova.virt.libvirt.vif [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-05T06:30:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-1236199558',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-1236199',id=22,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T06:30:25Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3c9c02fc0c8641c48e6ccfd619fde68d',ramdisk_id='',reservation_id='r-1zmw8e93',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,member,admin,reader',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-217954151',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-217954151-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T06:30:25Z,user_data=None,user_id='19182660d2484754b9a921f3caf09b6b',uuid=8378e152-c0bc-4b50-889d-5ed87ecd729c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9eb9012b-dce5-48ed-afc1-5fccc0654e2e", "address": "fa:16:3e:5a:41:f7", "network": {"id": "b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-188557792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32d36f4d4d354e18b9270b2f2f540379", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap9eb9012b-dc", "ovs_interfaceid": "9eb9012b-dce5-48ed-afc1-5fccc0654e2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.735 186333 DEBUG nova.network.os_vif_util [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Converting VIF {"id": "9eb9012b-dce5-48ed-afc1-5fccc0654e2e", "address": "fa:16:3e:5a:41:f7", "network": {"id": "b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-188557792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32d36f4d4d354e18b9270b2f2f540379", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap9eb9012b-dc", "ovs_interfaceid": "9eb9012b-dce5-48ed-afc1-5fccc0654e2e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.735 186333 DEBUG nova.network.os_vif_util [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5a:41:f7,bridge_name='br-int',has_traffic_filtering=True,id=9eb9012b-dce5-48ed-afc1-5fccc0654e2e,network=Network(b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9eb9012b-dc') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.736 186333 DEBUG os_vif [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5a:41:f7,bridge_name='br-int',has_traffic_filtering=True,id=9eb9012b-dce5-48ed-afc1-5fccc0654e2e,network=Network(b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9eb9012b-dc') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.736 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.737 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.737 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.738 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.738 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '373e5be3-478c-5d9b-aa77-23387a05ced3', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.739 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.741 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.743 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.743 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9eb9012b-dc, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.744 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap9eb9012b-dc, col_values=(('qos', UUID('4b062bbb-e1ae-4267-bc32-44a6731dae2f')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.744 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap9eb9012b-dc, col_values=(('external_ids', {'iface-id': '9eb9012b-dce5-48ed-afc1-5fccc0654e2e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5a:41:f7', 'vm-uuid': '8378e152-c0bc-4b50-889d-5ed87ecd729c'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.744 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:14 compute-0 NetworkManager[55434]: <info>  [1764916274.7454] manager: (tap9eb9012b-dc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/81)
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.747 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.748 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.749 186333 INFO os_vif [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5a:41:f7,bridge_name='br-int',has_traffic_filtering=True,id=9eb9012b-dce5-48ed-afc1-5fccc0654e2e,network=Network(b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9eb9012b-dc')
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.749 186333 DEBUG nova.virt.libvirt.driver [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.749 186333 DEBUG nova.compute.manager [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpr9_xm_lb',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='8378e152-c0bc-4b50-889d-5ed87ecd729c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Dec 05 06:31:14 compute-0 nova_compute[186329]: 2025-12-05 06:31:14.750 186333 WARNING neutronclient.v2_0.client [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:31:15 compute-0 nova_compute[186329]: 2025-12-05 06:31:15.031 186333 WARNING neutronclient.v2_0.client [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:31:15 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:15.206 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '46:84:d1', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'a6:99:88:db:d9:a2'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:31:15 compute-0 nova_compute[186329]: 2025-12-05 06:31:15.206 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:15 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:15.207 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 05 06:31:15 compute-0 nova_compute[186329]: 2025-12-05 06:31:15.459 186333 DEBUG nova.network.neutron [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8378e152-c0bc-4b50-889d-5ed87ecd729c] Port 9eb9012b-dce5-48ed-afc1-5fccc0654e2e updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Dec 05 06:31:15 compute-0 nova_compute[186329]: 2025-12-05 06:31:15.466 186333 DEBUG nova.compute.manager [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpr9_xm_lb',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='8378e152-c0bc-4b50-889d-5ed87ecd729c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Dec 05 06:31:17 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:17.208 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=89d40815-76f5-4f1d-9077-84d831b7d6c4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:31:18 compute-0 systemd[1]: Starting libvirt proxy daemon...
Dec 05 06:31:18 compute-0 systemd[1]: Started libvirt proxy daemon.
Dec 05 06:31:18 compute-0 podman[214144]: 2025-12-05 06:31:18.880943977 +0000 UTC m=+0.041824711 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Dec 05 06:31:18 compute-0 podman[214146]: 2025-12-05 06:31:18.897445122 +0000 UTC m=+0.054034568 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true)
Dec 05 06:31:18 compute-0 podman[214145]: 2025-12-05 06:31:18.90935229 +0000 UTC m=+0.069865663 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vcs-type=git, config_id=edpm, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, version=9.6, distribution-scope=public, architecture=x86_64, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, release=1755695350, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container)
Dec 05 06:31:18 compute-0 kernel: tap9eb9012b-dc: entered promiscuous mode
Dec 05 06:31:18 compute-0 ovn_controller[95223]: 2025-12-05T06:31:18Z|00186|binding|INFO|Claiming lport 9eb9012b-dce5-48ed-afc1-5fccc0654e2e for this additional chassis.
Dec 05 06:31:18 compute-0 ovn_controller[95223]: 2025-12-05T06:31:18Z|00187|binding|INFO|9eb9012b-dce5-48ed-afc1-5fccc0654e2e: Claiming fa:16:3e:5a:41:f7 10.100.0.8
Dec 05 06:31:18 compute-0 nova_compute[186329]: 2025-12-05 06:31:18.950 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:18 compute-0 nova_compute[186329]: 2025-12-05 06:31:18.953 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:18 compute-0 NetworkManager[55434]: <info>  [1764916278.9551] manager: (tap9eb9012b-dc): new Tun device (/org/freedesktop/NetworkManager/Devices/82)
Dec 05 06:31:18 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:18.961 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5a:41:f7 10.100.0.8'], port_security=['fa:16:3e:5a:41:f7 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '8378e152-c0bc-4b50-889d-5ed87ecd729c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3c9c02fc0c8641c48e6ccfd619fde68d', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'bcbaedeb-3d7e-4385-ba5a-8212e2f97a4c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fdc9478a-32e1-4f82-931f-bf423e0028ed, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=9eb9012b-dce5-48ed-afc1-5fccc0654e2e) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:31:18 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:18.965 104041 INFO neutron.agent.ovn.metadata.agent [-] Port 9eb9012b-dce5-48ed-afc1-5fccc0654e2e in datapath b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4 unbound from our chassis
Dec 05 06:31:18 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:18.966 104041 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4
Dec 05 06:31:18 compute-0 systemd-udevd[214231]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 06:31:18 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:18.975 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[d36b5df2-4cb0-4fc8-9458-c848e155f43a]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:31:18 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:18.976 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb1b8634d-61 in ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 05 06:31:18 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:18.977 206383 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb1b8634d-60 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 05 06:31:18 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:18.977 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[0aca8342-223e-4494-954e-8329083a4fab]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:31:18 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:18.978 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[94c91981-25d6-4cf8-a344-3f9fe5ddaf3e]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:31:18 compute-0 systemd-machined[152967]: New machine qemu-17-instance-00000016.
Dec 05 06:31:18 compute-0 NetworkManager[55434]: <info>  [1764916278.9888] device (tap9eb9012b-dc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 06:31:18 compute-0 NetworkManager[55434]: <info>  [1764916278.9894] device (tap9eb9012b-dc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 06:31:18 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:18.988 104153 DEBUG oslo.privsep.daemon [-] privsep: reply[2fa57998-a1c2-4669-92c6-6a89e2b5331a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:19.012 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[b18d4c48-1c99-43a3-8d57-fa661637cb8f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:31:19 compute-0 nova_compute[186329]: 2025-12-05 06:31:19.012 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:19 compute-0 systemd[1]: Started Virtual Machine qemu-17-instance-00000016.
Dec 05 06:31:19 compute-0 ovn_controller[95223]: 2025-12-05T06:31:19Z|00188|binding|INFO|Setting lport 9eb9012b-dce5-48ed-afc1-5fccc0654e2e ovn-installed in OVS
Dec 05 06:31:19 compute-0 nova_compute[186329]: 2025-12-05 06:31:19.017 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:19.032 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[fea62776-a3d4-4d63-9032-426ada6ad9ea]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:31:19 compute-0 NetworkManager[55434]: <info>  [1764916279.0376] manager: (tapb1b8634d-60): new Veth device (/org/freedesktop/NetworkManager/Devices/83)
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:19.038 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[2715e858-8823-4ebe-b52f-ff69e6d44e4d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:19.058 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[43c6f3e9-fad1-427e-98b3-c2aacf79f7b9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:19.060 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[d48baf0d-695f-4e4e-af80-2ff3bbb0e1ab]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:31:19 compute-0 NetworkManager[55434]: <info>  [1764916279.0768] device (tapb1b8634d-60): carrier: link connected
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:19.081 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[57ac0088-a495-4faa-bb9a-81875ffa8875]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:19.094 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[3e3a2b4e-2187-4bea-b676-dfd156591258]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb1b8634d-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:73:d2:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 54], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415761, 'reachable_time': 28164, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214257, 'error': None, 'target': 'ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:19.106 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[9115a9bb-bb29-4f20-97ed-f298eb8acb42]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe73:d262'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 415761, 'tstamp': 415761}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214258, 'error': None, 'target': 'ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:19.119 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[5a450c74-0de3-4722-971b-beaf2418661a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb1b8634d-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:73:d2:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 54], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415761, 'reachable_time': 28164, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214259, 'error': None, 'target': 'ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:19.138 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[6ad6fe93-6e82-4188-9d66-65fbbbb8fa89]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:19.181 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[555f2001-8f3b-4cea-8c19-2086ea6a7ac0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:19.181 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb1b8634d-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:19.182 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:19.182 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb1b8634d-60, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:31:19 compute-0 NetworkManager[55434]: <info>  [1764916279.1839] manager: (tapb1b8634d-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Dec 05 06:31:19 compute-0 nova_compute[186329]: 2025-12-05 06:31:19.183 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:19 compute-0 kernel: tapb1b8634d-60: entered promiscuous mode
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:19.185 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb1b8634d-60, col_values=(('external_ids', {'iface-id': 'd9989e5b-035f-46c9-8e0e-5fd96cf594af'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:31:19 compute-0 ovn_controller[95223]: 2025-12-05T06:31:19Z|00189|binding|INFO|Releasing lport d9989e5b-035f-46c9-8e0e-5fd96cf594af from this chassis (sb_readonly=0)
Dec 05 06:31:19 compute-0 nova_compute[186329]: 2025-12-05 06:31:19.186 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:19 compute-0 nova_compute[186329]: 2025-12-05 06:31:19.198 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:19.198 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[29aec216-20ee-4009-bbaa-479f18662158]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:19.199 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:19.199 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:19.199 104041 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:19.199 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:19.199 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[c27fa22b-8c60-43d1-9eeb-eedd28592cd2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:19.200 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:19.200 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[eed9a2d2-c5d9-41d9-b902-35acb330cce9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:19.200 104041 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]: global
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]:     log         /dev/log local0 debug
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]:     log-tag     haproxy-metadata-proxy-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]:     user        root
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]:     group       root
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]:     maxconn     1024
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]:     pidfile     /var/lib/neutron/external/pids/b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4.pid.haproxy
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]:     daemon
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]: defaults
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]:     log global
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]:     mode http
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]:     option httplog
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]:     option dontlognull
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]:     option http-server-close
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]:     option forwardfor
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]:     retries                 3
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]:     timeout http-request    30s
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]:     timeout connect         30s
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]:     timeout client          32s
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]:     timeout server          32s
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]:     timeout http-keep-alive 30s
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]: listen listener
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]:     bind 169.254.169.254:80
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]:     
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]:     http-request add-header X-OVN-Network-ID b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 05 06:31:19 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:19.201 104041 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4', 'env', 'PROCESS_TAG=haproxy-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 05 06:31:19 compute-0 podman[214310]: 2025-12-05 06:31:19.5169173 +0000 UTC m=+0.033368833 container create 5ddffcff8af582fcd5ad25c04a5afeeae249aa081400612a8c26827e96850a86 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.build-date=20251202)
Dec 05 06:31:19 compute-0 systemd[1]: Started libpod-conmon-5ddffcff8af582fcd5ad25c04a5afeeae249aa081400612a8c26827e96850a86.scope.
Dec 05 06:31:19 compute-0 systemd[1]: Started libcrun container.
Dec 05 06:31:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8fa3d194a6e419654ab1c71c709f81fbf4c54cdde58d649c4d9b2b43e2dc652e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 06:31:19 compute-0 podman[214310]: 2025-12-05 06:31:19.592209505 +0000 UTC m=+0.108661057 container init 5ddffcff8af582fcd5ad25c04a5afeeae249aa081400612a8c26827e96850a86 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.build-date=20251202)
Dec 05 06:31:19 compute-0 podman[214310]: 2025-12-05 06:31:19.596622412 +0000 UTC m=+0.113073954 container start 5ddffcff8af582fcd5ad25c04a5afeeae249aa081400612a8c26827e96850a86 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec 05 06:31:19 compute-0 podman[214310]: 2025-12-05 06:31:19.501471779 +0000 UTC m=+0.017923331 image pull 773fbabd60bc1c8e2ad203301a187a593937fa42bcc751aba0971bd96baac0cb quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current
Dec 05 06:31:19 compute-0 neutron-haproxy-ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4[214323]: [NOTICE]   (214327) : New worker (214329) forked
Dec 05 06:31:19 compute-0 neutron-haproxy-ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4[214323]: [NOTICE]   (214327) : Loading success.
Dec 05 06:31:19 compute-0 nova_compute[186329]: 2025-12-05 06:31:19.745 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:21 compute-0 ovn_controller[95223]: 2025-12-05T06:31:21Z|00190|binding|INFO|Claiming lport 9eb9012b-dce5-48ed-afc1-5fccc0654e2e for this chassis.
Dec 05 06:31:21 compute-0 ovn_controller[95223]: 2025-12-05T06:31:21Z|00191|binding|INFO|9eb9012b-dce5-48ed-afc1-5fccc0654e2e: Claiming fa:16:3e:5a:41:f7 10.100.0.8
Dec 05 06:31:21 compute-0 ovn_controller[95223]: 2025-12-05T06:31:21Z|00192|binding|INFO|Setting lport 9eb9012b-dce5-48ed-afc1-5fccc0654e2e up in Southbound
Dec 05 06:31:23 compute-0 nova_compute[186329]: 2025-12-05 06:31:23.661 186333 INFO nova.compute.manager [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8378e152-c0bc-4b50-889d-5ed87ecd729c] Post operation of migration started
Dec 05 06:31:23 compute-0 nova_compute[186329]: 2025-12-05 06:31:23.662 186333 WARNING neutronclient.v2_0.client [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:31:24 compute-0 nova_compute[186329]: 2025-12-05 06:31:24.016 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:24 compute-0 nova_compute[186329]: 2025-12-05 06:31:24.651 186333 WARNING neutronclient.v2_0.client [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:31:24 compute-0 nova_compute[186329]: 2025-12-05 06:31:24.652 186333 WARNING neutronclient.v2_0.client [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:31:24 compute-0 nova_compute[186329]: 2025-12-05 06:31:24.732 186333 DEBUG oslo_concurrency.lockutils [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "refresh_cache-8378e152-c0bc-4b50-889d-5ed87ecd729c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:31:24 compute-0 nova_compute[186329]: 2025-12-05 06:31:24.732 186333 DEBUG oslo_concurrency.lockutils [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquired lock "refresh_cache-8378e152-c0bc-4b50-889d-5ed87ecd729c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:31:24 compute-0 nova_compute[186329]: 2025-12-05 06:31:24.733 186333 DEBUG nova.network.neutron [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8378e152-c0bc-4b50-889d-5ed87ecd729c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 05 06:31:24 compute-0 nova_compute[186329]: 2025-12-05 06:31:24.748 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:25 compute-0 nova_compute[186329]: 2025-12-05 06:31:25.236 186333 WARNING neutronclient.v2_0.client [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:31:25 compute-0 nova_compute[186329]: 2025-12-05 06:31:25.884 186333 WARNING neutronclient.v2_0.client [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:31:25 compute-0 nova_compute[186329]: 2025-12-05 06:31:25.993 186333 DEBUG nova.network.neutron [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8378e152-c0bc-4b50-889d-5ed87ecd729c] Updating instance_info_cache with network_info: [{"id": "9eb9012b-dce5-48ed-afc1-5fccc0654e2e", "address": "fa:16:3e:5a:41:f7", "network": {"id": "b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-188557792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32d36f4d4d354e18b9270b2f2f540379", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9eb9012b-dc", "ovs_interfaceid": "9eb9012b-dce5-48ed-afc1-5fccc0654e2e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:31:26 compute-0 nova_compute[186329]: 2025-12-05 06:31:26.497 186333 DEBUG oslo_concurrency.lockutils [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Releasing lock "refresh_cache-8378e152-c0bc-4b50-889d-5ed87ecd729c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:31:27 compute-0 nova_compute[186329]: 2025-12-05 06:31:27.008 186333 DEBUG oslo_concurrency.lockutils [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:31:27 compute-0 nova_compute[186329]: 2025-12-05 06:31:27.009 186333 DEBUG oslo_concurrency.lockutils [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:31:27 compute-0 nova_compute[186329]: 2025-12-05 06:31:27.009 186333 DEBUG oslo_concurrency.lockutils [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:31:27 compute-0 nova_compute[186329]: 2025-12-05 06:31:27.012 186333 INFO nova.virt.libvirt.driver [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8378e152-c0bc-4b50-889d-5ed87ecd729c] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Dec 05 06:31:27 compute-0 virtqemud[186605]: Domain id=17 name='instance-00000016' uuid=8378e152-c0bc-4b50-889d-5ed87ecd729c is tainted: custom-monitor
Dec 05 06:31:28 compute-0 nova_compute[186329]: 2025-12-05 06:31:28.018 186333 INFO nova.virt.libvirt.driver [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8378e152-c0bc-4b50-889d-5ed87ecd729c] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Dec 05 06:31:29 compute-0 nova_compute[186329]: 2025-12-05 06:31:29.018 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:29 compute-0 nova_compute[186329]: 2025-12-05 06:31:29.021 186333 INFO nova.virt.libvirt.driver [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8378e152-c0bc-4b50-889d-5ed87ecd729c] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Dec 05 06:31:29 compute-0 nova_compute[186329]: 2025-12-05 06:31:29.024 186333 DEBUG nova.compute.manager [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8378e152-c0bc-4b50-889d-5ed87ecd729c] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 05 06:31:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:29.518 104041 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:31:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:29.518 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:31:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:29.519 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:31:29 compute-0 nova_compute[186329]: 2025-12-05 06:31:29.530 186333 DEBUG nova.objects.instance [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8378e152-c0bc-4b50-889d-5ed87ecd729c] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Dec 05 06:31:29 compute-0 podman[196599]: time="2025-12-05T06:31:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:31:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:31:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18590 "" "Go-http-client/1.1"
Dec 05 06:31:29 compute-0 nova_compute[186329]: 2025-12-05 06:31:29.748 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:31:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3050 "" "Go-http-client/1.1"
Dec 05 06:31:30 compute-0 nova_compute[186329]: 2025-12-05 06:31:30.541 186333 WARNING neutronclient.v2_0.client [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:31:30 compute-0 nova_compute[186329]: 2025-12-05 06:31:30.865 186333 WARNING neutronclient.v2_0.client [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:31:30 compute-0 nova_compute[186329]: 2025-12-05 06:31:30.865 186333 WARNING neutronclient.v2_0.client [None req-20100ce1-2b58-4423-b20a-745626fe9b10 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:31:31 compute-0 openstack_network_exporter[198686]: ERROR   06:31:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:31:31 compute-0 openstack_network_exporter[198686]: ERROR   06:31:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:31:31 compute-0 openstack_network_exporter[198686]: ERROR   06:31:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:31:31 compute-0 openstack_network_exporter[198686]: ERROR   06:31:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:31:31 compute-0 openstack_network_exporter[198686]: ERROR   06:31:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:31:34 compute-0 nova_compute[186329]: 2025-12-05 06:31:34.019 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:34 compute-0 nova_compute[186329]: 2025-12-05 06:31:34.749 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:39 compute-0 nova_compute[186329]: 2025-12-05 06:31:39.020 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:39 compute-0 podman[214337]: 2025-12-05 06:31:39.458458491 +0000 UTC m=+0.042386426 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 06:31:39 compute-0 podman[214336]: 2025-12-05 06:31:39.482575356 +0000 UTC m=+0.067140359 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 05 06:31:39 compute-0 nova_compute[186329]: 2025-12-05 06:31:39.751 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:39 compute-0 nova_compute[186329]: 2025-12-05 06:31:39.865 186333 DEBUG nova.compute.manager [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpx84c5lud',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='dd3b9204-cbe1-458d-b7eb-44c824804e0c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Dec 05 06:31:40 compute-0 nova_compute[186329]: 2025-12-05 06:31:40.873 186333 DEBUG oslo_concurrency.lockutils [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "refresh_cache-dd3b9204-cbe1-458d-b7eb-44c824804e0c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:31:40 compute-0 nova_compute[186329]: 2025-12-05 06:31:40.874 186333 DEBUG oslo_concurrency.lockutils [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquired lock "refresh_cache-dd3b9204-cbe1-458d-b7eb-44c824804e0c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:31:40 compute-0 nova_compute[186329]: 2025-12-05 06:31:40.874 186333 DEBUG nova.network.neutron [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dd3b9204-cbe1-458d-b7eb-44c824804e0c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 05 06:31:41 compute-0 nova_compute[186329]: 2025-12-05 06:31:41.378 186333 WARNING neutronclient.v2_0.client [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:31:41 compute-0 nova_compute[186329]: 2025-12-05 06:31:41.826 186333 WARNING neutronclient.v2_0.client [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:31:41 compute-0 nova_compute[186329]: 2025-12-05 06:31:41.968 186333 DEBUG nova.network.neutron [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dd3b9204-cbe1-458d-b7eb-44c824804e0c] Updating instance_info_cache with network_info: [{"id": "cc431f94-2fd5-406f-bccb-182c23440f47", "address": "fa:16:3e:83:04:8f", "network": {"id": "b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-188557792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32d36f4d4d354e18b9270b2f2f540379", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc431f94-2f", "ovs_interfaceid": "cc431f94-2fd5-406f-bccb-182c23440f47", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:31:42 compute-0 nova_compute[186329]: 2025-12-05 06:31:42.474 186333 DEBUG oslo_concurrency.lockutils [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Releasing lock "refresh_cache-dd3b9204-cbe1-458d-b7eb-44c824804e0c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:31:42 compute-0 nova_compute[186329]: 2025-12-05 06:31:42.482 186333 DEBUG nova.virt.libvirt.driver [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dd3b9204-cbe1-458d-b7eb-44c824804e0c] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpx84c5lud',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='dd3b9204-cbe1-458d-b7eb-44c824804e0c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Dec 05 06:31:42 compute-0 nova_compute[186329]: 2025-12-05 06:31:42.483 186333 DEBUG nova.virt.libvirt.driver [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dd3b9204-cbe1-458d-b7eb-44c824804e0c] Creating instance directory: /var/lib/nova/instances/dd3b9204-cbe1-458d-b7eb-44c824804e0c pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Dec 05 06:31:42 compute-0 nova_compute[186329]: 2025-12-05 06:31:42.483 186333 DEBUG nova.virt.libvirt.driver [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dd3b9204-cbe1-458d-b7eb-44c824804e0c] Creating disk.info with the contents: {'/var/lib/nova/instances/dd3b9204-cbe1-458d-b7eb-44c824804e0c/disk': 'qcow2', '/var/lib/nova/instances/dd3b9204-cbe1-458d-b7eb-44c824804e0c/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Dec 05 06:31:42 compute-0 nova_compute[186329]: 2025-12-05 06:31:42.484 186333 DEBUG nova.virt.libvirt.driver [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dd3b9204-cbe1-458d-b7eb-44c824804e0c] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Dec 05 06:31:42 compute-0 nova_compute[186329]: 2025-12-05 06:31:42.484 186333 DEBUG nova.objects.instance [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lazy-loading 'trusted_certs' on Instance uuid dd3b9204-cbe1-458d-b7eb-44c824804e0c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:31:42 compute-0 nova_compute[186329]: 2025-12-05 06:31:42.988 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:31:42 compute-0 nova_compute[186329]: 2025-12-05 06:31:42.991 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:31:42 compute-0 nova_compute[186329]: 2025-12-05 06:31:42.992 186333 DEBUG oslo_concurrency.processutils [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:31:43 compute-0 nova_compute[186329]: 2025-12-05 06:31:43.033 186333 DEBUG oslo_concurrency.processutils [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:31:43 compute-0 nova_compute[186329]: 2025-12-05 06:31:43.034 186333 DEBUG oslo_concurrency.lockutils [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:31:43 compute-0 nova_compute[186329]: 2025-12-05 06:31:43.034 186333 DEBUG oslo_concurrency.lockutils [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:31:43 compute-0 nova_compute[186329]: 2025-12-05 06:31:43.035 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:31:43 compute-0 nova_compute[186329]: 2025-12-05 06:31:43.037 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:31:43 compute-0 nova_compute[186329]: 2025-12-05 06:31:43.038 186333 DEBUG oslo_concurrency.processutils [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:31:43 compute-0 nova_compute[186329]: 2025-12-05 06:31:43.077 186333 DEBUG oslo_concurrency.processutils [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:31:43 compute-0 nova_compute[186329]: 2025-12-05 06:31:43.078 186333 DEBUG oslo_concurrency.processutils [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b,backing_fmt=raw /var/lib/nova/instances/dd3b9204-cbe1-458d-b7eb-44c824804e0c/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:31:43 compute-0 nova_compute[186329]: 2025-12-05 06:31:43.097 186333 DEBUG oslo_concurrency.processutils [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b,backing_fmt=raw /var/lib/nova/instances/dd3b9204-cbe1-458d-b7eb-44c824804e0c/disk 1073741824" returned: 0 in 0.019s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:31:43 compute-0 nova_compute[186329]: 2025-12-05 06:31:43.098 186333 DEBUG oslo_concurrency.lockutils [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.063s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:31:43 compute-0 nova_compute[186329]: 2025-12-05 06:31:43.098 186333 DEBUG oslo_concurrency.processutils [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:31:43 compute-0 nova_compute[186329]: 2025-12-05 06:31:43.138 186333 DEBUG oslo_concurrency.processutils [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:31:43 compute-0 nova_compute[186329]: 2025-12-05 06:31:43.139 186333 DEBUG nova.virt.disk.api [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Checking if we can resize image /var/lib/nova/instances/dd3b9204-cbe1-458d-b7eb-44c824804e0c/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 05 06:31:43 compute-0 nova_compute[186329]: 2025-12-05 06:31:43.139 186333 DEBUG oslo_concurrency.processutils [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd3b9204-cbe1-458d-b7eb-44c824804e0c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:31:43 compute-0 nova_compute[186329]: 2025-12-05 06:31:43.180 186333 DEBUG oslo_concurrency.processutils [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd3b9204-cbe1-458d-b7eb-44c824804e0c/disk --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:31:43 compute-0 nova_compute[186329]: 2025-12-05 06:31:43.180 186333 DEBUG nova.virt.disk.api [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Cannot resize image /var/lib/nova/instances/dd3b9204-cbe1-458d-b7eb-44c824804e0c/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 05 06:31:43 compute-0 nova_compute[186329]: 2025-12-05 06:31:43.181 186333 DEBUG nova.objects.instance [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lazy-loading 'migration_context' on Instance uuid dd3b9204-cbe1-458d-b7eb-44c824804e0c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:31:43 compute-0 nova_compute[186329]: 2025-12-05 06:31:43.685 186333 DEBUG nova.objects.base [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Object Instance<dd3b9204-cbe1-458d-b7eb-44c824804e0c> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Dec 05 06:31:43 compute-0 nova_compute[186329]: 2025-12-05 06:31:43.686 186333 DEBUG oslo_concurrency.processutils [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/dd3b9204-cbe1-458d-b7eb-44c824804e0c/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:31:43 compute-0 nova_compute[186329]: 2025-12-05 06:31:43.702 186333 DEBUG oslo_concurrency.processutils [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/dd3b9204-cbe1-458d-b7eb-44c824804e0c/disk.config 497664" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:31:43 compute-0 nova_compute[186329]: 2025-12-05 06:31:43.703 186333 DEBUG nova.virt.libvirt.driver [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dd3b9204-cbe1-458d-b7eb-44c824804e0c] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Dec 05 06:31:43 compute-0 nova_compute[186329]: 2025-12-05 06:31:43.704 186333 DEBUG nova.virt.libvirt.vif [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-05T06:30:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-929815684',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-9298156',id=23,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T06:30:45Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3c9c02fc0c8641c48e6ccfd619fde68d',ramdisk_id='',reservation_id='r-wxriv14v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,member,admin,reader',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-217954151',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-217954151-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T06:30:45Z,user_data=None,user_id='19182660d2484754b9a921f3caf09b6b',uuid=dd3b9204-cbe1-458d-b7eb-44c824804e0c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cc431f94-2fd5-406f-bccb-182c23440f47", "address": "fa:16:3e:83:04:8f", "network": {"id": "b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-188557792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32d36f4d4d354e18b9270b2f2f540379", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapcc431f94-2f", "ovs_interfaceid": "cc431f94-2fd5-406f-bccb-182c23440f47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 05 06:31:43 compute-0 nova_compute[186329]: 2025-12-05 06:31:43.704 186333 DEBUG nova.network.os_vif_util [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Converting VIF {"id": "cc431f94-2fd5-406f-bccb-182c23440f47", "address": "fa:16:3e:83:04:8f", "network": {"id": "b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-188557792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32d36f4d4d354e18b9270b2f2f540379", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapcc431f94-2f", "ovs_interfaceid": "cc431f94-2fd5-406f-bccb-182c23440f47", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:31:43 compute-0 nova_compute[186329]: 2025-12-05 06:31:43.705 186333 DEBUG nova.network.os_vif_util [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:83:04:8f,bridge_name='br-int',has_traffic_filtering=True,id=cc431f94-2fd5-406f-bccb-182c23440f47,network=Network(b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc431f94-2f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:31:43 compute-0 nova_compute[186329]: 2025-12-05 06:31:43.705 186333 DEBUG os_vif [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:04:8f,bridge_name='br-int',has_traffic_filtering=True,id=cc431f94-2fd5-406f-bccb-182c23440f47,network=Network(b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc431f94-2f') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 05 06:31:43 compute-0 nova_compute[186329]: 2025-12-05 06:31:43.707 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:43 compute-0 nova_compute[186329]: 2025-12-05 06:31:43.707 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:31:43 compute-0 nova_compute[186329]: 2025-12-05 06:31:43.708 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:31:43 compute-0 nova_compute[186329]: 2025-12-05 06:31:43.708 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:43 compute-0 nova_compute[186329]: 2025-12-05 06:31:43.709 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '86a10b69-6cb3-5ee4-b7c0-0eb224512340', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:31:43 compute-0 nova_compute[186329]: 2025-12-05 06:31:43.710 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:43 compute-0 nova_compute[186329]: 2025-12-05 06:31:43.712 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:31:43 compute-0 nova_compute[186329]: 2025-12-05 06:31:43.714 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:43 compute-0 nova_compute[186329]: 2025-12-05 06:31:43.714 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcc431f94-2f, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:31:43 compute-0 nova_compute[186329]: 2025-12-05 06:31:43.714 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapcc431f94-2f, col_values=(('qos', UUID('2182cb92-1374-49a0-9464-3cdccc84c9eb')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:31:43 compute-0 nova_compute[186329]: 2025-12-05 06:31:43.715 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapcc431f94-2f, col_values=(('external_ids', {'iface-id': 'cc431f94-2fd5-406f-bccb-182c23440f47', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:83:04:8f', 'vm-uuid': 'dd3b9204-cbe1-458d-b7eb-44c824804e0c'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:31:43 compute-0 nova_compute[186329]: 2025-12-05 06:31:43.716 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:43 compute-0 NetworkManager[55434]: <info>  [1764916303.7167] manager: (tapcc431f94-2f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/85)
Dec 05 06:31:43 compute-0 nova_compute[186329]: 2025-12-05 06:31:43.717 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:31:43 compute-0 nova_compute[186329]: 2025-12-05 06:31:43.720 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:43 compute-0 nova_compute[186329]: 2025-12-05 06:31:43.721 186333 INFO os_vif [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:04:8f,bridge_name='br-int',has_traffic_filtering=True,id=cc431f94-2fd5-406f-bccb-182c23440f47,network=Network(b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc431f94-2f')
Dec 05 06:31:43 compute-0 nova_compute[186329]: 2025-12-05 06:31:43.721 186333 DEBUG nova.virt.libvirt.driver [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Dec 05 06:31:43 compute-0 nova_compute[186329]: 2025-12-05 06:31:43.721 186333 DEBUG nova.compute.manager [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpx84c5lud',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='dd3b9204-cbe1-458d-b7eb-44c824804e0c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Dec 05 06:31:43 compute-0 nova_compute[186329]: 2025-12-05 06:31:43.722 186333 WARNING neutronclient.v2_0.client [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:31:44 compute-0 nova_compute[186329]: 2025-12-05 06:31:44.021 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:44 compute-0 nova_compute[186329]: 2025-12-05 06:31:44.623 186333 WARNING neutronclient.v2_0.client [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:31:45 compute-0 nova_compute[186329]: 2025-12-05 06:31:45.462 186333 DEBUG nova.network.neutron [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dd3b9204-cbe1-458d-b7eb-44c824804e0c] Port cc431f94-2fd5-406f-bccb-182c23440f47 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Dec 05 06:31:45 compute-0 nova_compute[186329]: 2025-12-05 06:31:45.469 186333 DEBUG nova.compute.manager [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpx84c5lud',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='dd3b9204-cbe1-458d-b7eb-44c824804e0c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Dec 05 06:31:48 compute-0 kernel: tapcc431f94-2f: entered promiscuous mode
Dec 05 06:31:48 compute-0 NetworkManager[55434]: <info>  [1764916308.5155] manager: (tapcc431f94-2f): new Tun device (/org/freedesktop/NetworkManager/Devices/86)
Dec 05 06:31:48 compute-0 ovn_controller[95223]: 2025-12-05T06:31:48Z|00193|binding|INFO|Claiming lport cc431f94-2fd5-406f-bccb-182c23440f47 for this additional chassis.
Dec 05 06:31:48 compute-0 ovn_controller[95223]: 2025-12-05T06:31:48Z|00194|binding|INFO|cc431f94-2fd5-406f-bccb-182c23440f47: Claiming fa:16:3e:83:04:8f 10.100.0.12
Dec 05 06:31:48 compute-0 nova_compute[186329]: 2025-12-05 06:31:48.517 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:48 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:48.531 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:04:8f 10.100.0.12'], port_security=['fa:16:3e:83:04:8f 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'dd3b9204-cbe1-458d-b7eb-44c824804e0c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3c9c02fc0c8641c48e6ccfd619fde68d', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'bcbaedeb-3d7e-4385-ba5a-8212e2f97a4c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fdc9478a-32e1-4f82-931f-bf423e0028ed, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=cc431f94-2fd5-406f-bccb-182c23440f47) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:31:48 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:48.532 104041 INFO neutron.agent.ovn.metadata.agent [-] Port cc431f94-2fd5-406f-bccb-182c23440f47 in datapath b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4 unbound from our chassis
Dec 05 06:31:48 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:48.533 104041 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4
Dec 05 06:31:48 compute-0 nova_compute[186329]: 2025-12-05 06:31:48.533 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:48 compute-0 ovn_controller[95223]: 2025-12-05T06:31:48Z|00195|binding|INFO|Setting lport cc431f94-2fd5-406f-bccb-182c23440f47 ovn-installed in OVS
Dec 05 06:31:48 compute-0 nova_compute[186329]: 2025-12-05 06:31:48.536 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:48 compute-0 nova_compute[186329]: 2025-12-05 06:31:48.538 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:48 compute-0 systemd-udevd[214414]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 06:31:48 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:48.544 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[7fe5e7fa-0f6f-4c0b-b1c1-1542afdf5d7e]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:31:48 compute-0 systemd-machined[152967]: New machine qemu-18-instance-00000017.
Dec 05 06:31:48 compute-0 NetworkManager[55434]: <info>  [1764916308.5572] device (tapcc431f94-2f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 06:31:48 compute-0 NetworkManager[55434]: <info>  [1764916308.5585] device (tapcc431f94-2f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 06:31:48 compute-0 systemd[1]: Started Virtual Machine qemu-18-instance-00000017.
Dec 05 06:31:48 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:48.565 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[270bcaa0-db34-464e-ab19-fa9f049ced69]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:31:48 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:48.566 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[a8a51837-8224-4c3e-8a70-7312dd1c9918]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:31:48 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:48.585 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[fda05400-257a-438a-a2ca-548e1f7091b8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:31:48 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:48.601 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[fd4b062d-997a-48b9-afae-3d22bf288b5a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb1b8634d-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:73:d2:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 26, 'tx_packets': 5, 'rx_bytes': 1372, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 26, 'tx_packets': 5, 'rx_bytes': 1372, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 54], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415761, 'reachable_time': 28164, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214427, 'error': None, 'target': 'ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:31:48 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:48.613 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[7fd12353-09fb-43e4-9c09-38a11ded9c39]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb1b8634d-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 415769, 'tstamp': 415769}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214430, 'error': None, 'target': 'ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb1b8634d-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 415771, 'tstamp': 415771}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214430, 'error': None, 'target': 'ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:31:48 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:48.613 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb1b8634d-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:31:48 compute-0 nova_compute[186329]: 2025-12-05 06:31:48.615 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:48 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:48.617 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb1b8634d-60, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:31:48 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:48.617 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:31:48 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:48.617 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb1b8634d-60, col_values=(('external_ids', {'iface-id': 'd9989e5b-035f-46c9-8e0e-5fd96cf594af'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:31:48 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:48.618 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:31:48 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:31:48.619 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[a7e26dc3-ddea-46ae-a958-93413e314e95]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:31:48 compute-0 nova_compute[186329]: 2025-12-05 06:31:48.717 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:49 compute-0 nova_compute[186329]: 2025-12-05 06:31:49.022 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:49 compute-0 podman[214441]: 2025-12-05 06:31:49.474898099 +0000 UTC m=+0.056581756 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible)
Dec 05 06:31:49 compute-0 podman[214439]: 2025-12-05 06:31:49.492014511 +0000 UTC m=+0.078638549 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4)
Dec 05 06:31:49 compute-0 podman[214440]: 2025-12-05 06:31:49.504875051 +0000 UTC m=+0.086632147 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, io.buildah.version=1.33.7, managed_by=edpm_ansible, name=ubi9-minimal)
Dec 05 06:31:51 compute-0 ovn_controller[95223]: 2025-12-05T06:31:51Z|00196|binding|INFO|Claiming lport cc431f94-2fd5-406f-bccb-182c23440f47 for this chassis.
Dec 05 06:31:51 compute-0 ovn_controller[95223]: 2025-12-05T06:31:51Z|00197|binding|INFO|cc431f94-2fd5-406f-bccb-182c23440f47: Claiming fa:16:3e:83:04:8f 10.100.0.12
Dec 05 06:31:51 compute-0 ovn_controller[95223]: 2025-12-05T06:31:51Z|00198|binding|INFO|Setting lport cc431f94-2fd5-406f-bccb-182c23440f47 up in Southbound
Dec 05 06:31:52 compute-0 nova_compute[186329]: 2025-12-05 06:31:52.917 186333 INFO nova.compute.manager [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dd3b9204-cbe1-458d-b7eb-44c824804e0c] Post operation of migration started
Dec 05 06:31:52 compute-0 nova_compute[186329]: 2025-12-05 06:31:52.917 186333 WARNING neutronclient.v2_0.client [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:31:52 compute-0 nova_compute[186329]: 2025-12-05 06:31:52.998 186333 WARNING neutronclient.v2_0.client [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:31:52 compute-0 nova_compute[186329]: 2025-12-05 06:31:52.999 186333 WARNING neutronclient.v2_0.client [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:31:53 compute-0 nova_compute[186329]: 2025-12-05 06:31:53.058 186333 DEBUG oslo_concurrency.lockutils [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "refresh_cache-dd3b9204-cbe1-458d-b7eb-44c824804e0c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:31:53 compute-0 nova_compute[186329]: 2025-12-05 06:31:53.059 186333 DEBUG oslo_concurrency.lockutils [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquired lock "refresh_cache-dd3b9204-cbe1-458d-b7eb-44c824804e0c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:31:53 compute-0 nova_compute[186329]: 2025-12-05 06:31:53.059 186333 DEBUG nova.network.neutron [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dd3b9204-cbe1-458d-b7eb-44c824804e0c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 05 06:31:53 compute-0 nova_compute[186329]: 2025-12-05 06:31:53.562 186333 WARNING neutronclient.v2_0.client [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:31:53 compute-0 nova_compute[186329]: 2025-12-05 06:31:53.717 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:53 compute-0 nova_compute[186329]: 2025-12-05 06:31:53.835 186333 WARNING neutronclient.v2_0.client [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:31:53 compute-0 nova_compute[186329]: 2025-12-05 06:31:53.955 186333 DEBUG nova.network.neutron [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dd3b9204-cbe1-458d-b7eb-44c824804e0c] Updating instance_info_cache with network_info: [{"id": "cc431f94-2fd5-406f-bccb-182c23440f47", "address": "fa:16:3e:83:04:8f", "network": {"id": "b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-188557792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32d36f4d4d354e18b9270b2f2f540379", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc431f94-2f", "ovs_interfaceid": "cc431f94-2fd5-406f-bccb-182c23440f47", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:31:54 compute-0 nova_compute[186329]: 2025-12-05 06:31:54.023 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:54 compute-0 nova_compute[186329]: 2025-12-05 06:31:54.459 186333 DEBUG oslo_concurrency.lockutils [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Releasing lock "refresh_cache-dd3b9204-cbe1-458d-b7eb-44c824804e0c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:31:54 compute-0 nova_compute[186329]: 2025-12-05 06:31:54.969 186333 DEBUG oslo_concurrency.lockutils [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:31:54 compute-0 nova_compute[186329]: 2025-12-05 06:31:54.969 186333 DEBUG oslo_concurrency.lockutils [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:31:54 compute-0 nova_compute[186329]: 2025-12-05 06:31:54.969 186333 DEBUG oslo_concurrency.lockutils [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:31:54 compute-0 nova_compute[186329]: 2025-12-05 06:31:54.973 186333 INFO nova.virt.libvirt.driver [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dd3b9204-cbe1-458d-b7eb-44c824804e0c] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Dec 05 06:31:54 compute-0 virtqemud[186605]: Domain id=18 name='instance-00000017' uuid=dd3b9204-cbe1-458d-b7eb-44c824804e0c is tainted: custom-monitor
Dec 05 06:31:55 compute-0 nova_compute[186329]: 2025-12-05 06:31:55.978 186333 INFO nova.virt.libvirt.driver [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dd3b9204-cbe1-458d-b7eb-44c824804e0c] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Dec 05 06:31:56 compute-0 nova_compute[186329]: 2025-12-05 06:31:56.982 186333 INFO nova.virt.libvirt.driver [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dd3b9204-cbe1-458d-b7eb-44c824804e0c] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Dec 05 06:31:56 compute-0 nova_compute[186329]: 2025-12-05 06:31:56.985 186333 DEBUG nova.compute.manager [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dd3b9204-cbe1-458d-b7eb-44c824804e0c] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 05 06:31:57 compute-0 nova_compute[186329]: 2025-12-05 06:31:57.491 186333 DEBUG nova.objects.instance [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dd3b9204-cbe1-458d-b7eb-44c824804e0c] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Dec 05 06:31:58 compute-0 nova_compute[186329]: 2025-12-05 06:31:58.503 186333 WARNING neutronclient.v2_0.client [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:31:58 compute-0 nova_compute[186329]: 2025-12-05 06:31:58.624 186333 WARNING neutronclient.v2_0.client [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:31:58 compute-0 nova_compute[186329]: 2025-12-05 06:31:58.625 186333 WARNING neutronclient.v2_0.client [None req-fd69c3a1-3240-40fc-ae82-b6f35ab367d3 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:31:58 compute-0 nova_compute[186329]: 2025-12-05 06:31:58.719 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:59 compute-0 nova_compute[186329]: 2025-12-05 06:31:59.024 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:31:59 compute-0 nova_compute[186329]: 2025-12-05 06:31:59.705 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:31:59 compute-0 nova_compute[186329]: 2025-12-05 06:31:59.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:31:59 compute-0 podman[196599]: time="2025-12-05T06:31:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:31:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:31:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18590 "" "Go-http-client/1.1"
Dec 05 06:31:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:31:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3051 "" "Go-http-client/1.1"
Dec 05 06:32:00 compute-0 nova_compute[186329]: 2025-12-05 06:32:00.403 186333 DEBUG oslo_concurrency.lockutils [None req-33ee5a57-0851-4688-aecb-f9fda857c30d 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Acquiring lock "dd3b9204-cbe1-458d-b7eb-44c824804e0c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:32:00 compute-0 nova_compute[186329]: 2025-12-05 06:32:00.404 186333 DEBUG oslo_concurrency.lockutils [None req-33ee5a57-0851-4688-aecb-f9fda857c30d 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Lock "dd3b9204-cbe1-458d-b7eb-44c824804e0c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:32:00 compute-0 nova_compute[186329]: 2025-12-05 06:32:00.404 186333 DEBUG oslo_concurrency.lockutils [None req-33ee5a57-0851-4688-aecb-f9fda857c30d 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Acquiring lock "dd3b9204-cbe1-458d-b7eb-44c824804e0c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:32:00 compute-0 nova_compute[186329]: 2025-12-05 06:32:00.404 186333 DEBUG oslo_concurrency.lockutils [None req-33ee5a57-0851-4688-aecb-f9fda857c30d 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Lock "dd3b9204-cbe1-458d-b7eb-44c824804e0c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:32:00 compute-0 nova_compute[186329]: 2025-12-05 06:32:00.405 186333 DEBUG oslo_concurrency.lockutils [None req-33ee5a57-0851-4688-aecb-f9fda857c30d 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Lock "dd3b9204-cbe1-458d-b7eb-44c824804e0c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:32:00 compute-0 nova_compute[186329]: 2025-12-05 06:32:00.411 186333 INFO nova.compute.manager [None req-33ee5a57-0851-4688-aecb-f9fda857c30d 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] [instance: dd3b9204-cbe1-458d-b7eb-44c824804e0c] Terminating instance
Dec 05 06:32:00 compute-0 nova_compute[186329]: 2025-12-05 06:32:00.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:32:00 compute-0 nova_compute[186329]: 2025-12-05 06:32:00.710 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 05 06:32:00 compute-0 nova_compute[186329]: 2025-12-05 06:32:00.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:32:00 compute-0 nova_compute[186329]: 2025-12-05 06:32:00.923 186333 DEBUG nova.compute.manager [None req-33ee5a57-0851-4688-aecb-f9fda857c30d 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] [instance: dd3b9204-cbe1-458d-b7eb-44c824804e0c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 05 06:32:00 compute-0 kernel: tapcc431f94-2f (unregistering): left promiscuous mode
Dec 05 06:32:00 compute-0 NetworkManager[55434]: <info>  [1764916320.9496] device (tapcc431f94-2f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 06:32:00 compute-0 ovn_controller[95223]: 2025-12-05T06:32:00Z|00199|binding|INFO|Releasing lport cc431f94-2fd5-406f-bccb-182c23440f47 from this chassis (sb_readonly=0)
Dec 05 06:32:00 compute-0 ovn_controller[95223]: 2025-12-05T06:32:00Z|00200|binding|INFO|Setting lport cc431f94-2fd5-406f-bccb-182c23440f47 down in Southbound
Dec 05 06:32:00 compute-0 nova_compute[186329]: 2025-12-05 06:32:00.961 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:00 compute-0 ovn_controller[95223]: 2025-12-05T06:32:00Z|00201|binding|INFO|Removing iface tapcc431f94-2f ovn-installed in OVS
Dec 05 06:32:00 compute-0 nova_compute[186329]: 2025-12-05 06:32:00.962 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:00 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:32:00.974 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:04:8f 10.100.0.12'], port_security=['fa:16:3e:83:04:8f 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'dd3b9204-cbe1-458d-b7eb-44c824804e0c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3c9c02fc0c8641c48e6ccfd619fde68d', 'neutron:revision_number': '14', 'neutron:security_group_ids': 'bcbaedeb-3d7e-4385-ba5a-8212e2f97a4c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fdc9478a-32e1-4f82-931f-bf423e0028ed, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], logical_port=cc431f94-2fd5-406f-bccb-182c23440f47) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:32:00 compute-0 nova_compute[186329]: 2025-12-05 06:32:00.976 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:00 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:32:00.975 104041 INFO neutron.agent.ovn.metadata.agent [-] Port cc431f94-2fd5-406f-bccb-182c23440f47 in datapath b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4 unbound from our chassis
Dec 05 06:32:00 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:32:00.976 104041 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4
Dec 05 06:32:00 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:32:00.987 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[00f361c5-95ee-426b-9489-bd3ddec805d4]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:32:00 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000017.scope: Deactivated successfully.
Dec 05 06:32:00 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000017.scope: Consumed 2.356s CPU time.
Dec 05 06:32:00 compute-0 systemd-machined[152967]: Machine qemu-18-instance-00000017 terminated.
Dec 05 06:32:01 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:32:01.008 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[a1342bec-f02e-44ae-b7dd-1baf62f82901]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:32:01 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:32:01.010 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[53f845a3-a029-49eb-bc17-de8d4da106ca]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:32:01 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:32:01.028 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[81c7452f-0c31-4501-98cc-26ca5cfddf0f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:32:01 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:32:01.040 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[e1345241-224f-4783-b1aa-e0e5a7141c20]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb1b8634d-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:73:d2:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 46, 'tx_packets': 7, 'rx_bytes': 2212, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 46, 'tx_packets': 7, 'rx_bytes': 2212, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 54], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415761, 'reachable_time': 28164, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214516, 'error': None, 'target': 'ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:32:01 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:32:01.049 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[2604a4b3-ca86-45e3-878d-24e56d876b68]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb1b8634d-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 415769, 'tstamp': 415769}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214517, 'error': None, 'target': 'ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb1b8634d-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 415771, 'tstamp': 415771}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214517, 'error': None, 'target': 'ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:32:01 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:32:01.050 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb1b8634d-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:32:01 compute-0 nova_compute[186329]: 2025-12-05 06:32:01.051 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:01 compute-0 nova_compute[186329]: 2025-12-05 06:32:01.054 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:01 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:32:01.054 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb1b8634d-60, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:32:01 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:32:01.055 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:32:01 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:32:01.055 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb1b8634d-60, col_values=(('external_ids', {'iface-id': 'd9989e5b-035f-46c9-8e0e-5fd96cf594af'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:32:01 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:32:01.055 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:32:01 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:32:01.056 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[0371e417-a643-49fe-9735-7d22b692d36e]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:32:01 compute-0 nova_compute[186329]: 2025-12-05 06:32:01.079 186333 DEBUG nova.compute.manager [req-888bb546-af42-46a1-9009-e648aaee00eb req-6b2df1d9-5a23-49ed-ab36-fa790b7422fc fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dd3b9204-cbe1-458d-b7eb-44c824804e0c] Received event network-vif-unplugged-cc431f94-2fd5-406f-bccb-182c23440f47 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:32:01 compute-0 nova_compute[186329]: 2025-12-05 06:32:01.080 186333 DEBUG oslo_concurrency.lockutils [req-888bb546-af42-46a1-9009-e648aaee00eb req-6b2df1d9-5a23-49ed-ab36-fa790b7422fc fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "dd3b9204-cbe1-458d-b7eb-44c824804e0c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:32:01 compute-0 nova_compute[186329]: 2025-12-05 06:32:01.080 186333 DEBUG oslo_concurrency.lockutils [req-888bb546-af42-46a1-9009-e648aaee00eb req-6b2df1d9-5a23-49ed-ab36-fa790b7422fc fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "dd3b9204-cbe1-458d-b7eb-44c824804e0c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:32:01 compute-0 nova_compute[186329]: 2025-12-05 06:32:01.080 186333 DEBUG oslo_concurrency.lockutils [req-888bb546-af42-46a1-9009-e648aaee00eb req-6b2df1d9-5a23-49ed-ab36-fa790b7422fc fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "dd3b9204-cbe1-458d-b7eb-44c824804e0c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:32:01 compute-0 nova_compute[186329]: 2025-12-05 06:32:01.080 186333 DEBUG nova.compute.manager [req-888bb546-af42-46a1-9009-e648aaee00eb req-6b2df1d9-5a23-49ed-ab36-fa790b7422fc fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dd3b9204-cbe1-458d-b7eb-44c824804e0c] No waiting events found dispatching network-vif-unplugged-cc431f94-2fd5-406f-bccb-182c23440f47 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:32:01 compute-0 nova_compute[186329]: 2025-12-05 06:32:01.080 186333 DEBUG nova.compute.manager [req-888bb546-af42-46a1-9009-e648aaee00eb req-6b2df1d9-5a23-49ed-ab36-fa790b7422fc fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dd3b9204-cbe1-458d-b7eb-44c824804e0c] Received event network-vif-unplugged-cc431f94-2fd5-406f-bccb-182c23440f47 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 05 06:32:01 compute-0 nova_compute[186329]: 2025-12-05 06:32:01.161 186333 INFO nova.virt.libvirt.driver [-] [instance: dd3b9204-cbe1-458d-b7eb-44c824804e0c] Instance destroyed successfully.
Dec 05 06:32:01 compute-0 nova_compute[186329]: 2025-12-05 06:32:01.161 186333 DEBUG nova.objects.instance [None req-33ee5a57-0851-4688-aecb-f9fda857c30d 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Lazy-loading 'resources' on Instance uuid dd3b9204-cbe1-458d-b7eb-44c824804e0c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:32:01 compute-0 nova_compute[186329]: 2025-12-05 06:32:01.218 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:32:01 compute-0 nova_compute[186329]: 2025-12-05 06:32:01.218 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:32:01 compute-0 nova_compute[186329]: 2025-12-05 06:32:01.218 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:32:01 compute-0 nova_compute[186329]: 2025-12-05 06:32:01.218 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 05 06:32:01 compute-0 openstack_network_exporter[198686]: ERROR   06:32:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:32:01 compute-0 openstack_network_exporter[198686]: ERROR   06:32:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:32:01 compute-0 openstack_network_exporter[198686]: ERROR   06:32:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:32:01 compute-0 openstack_network_exporter[198686]: ERROR   06:32:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:32:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:32:01 compute-0 openstack_network_exporter[198686]: ERROR   06:32:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:32:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:32:01 compute-0 nova_compute[186329]: 2025-12-05 06:32:01.666 186333 DEBUG nova.virt.libvirt.vif [None req-33ee5a57-0851-4688-aecb-f9fda857c30d 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-12-05T06:30:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-929815684',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-9298156',id=23,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T06:30:45Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3c9c02fc0c8641c48e6ccfd619fde68d',ramdisk_id='',reservation_id='r-wxriv14v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,member,admin,reader',clean_attempts='1',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-217954151',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-217954151-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T06:31:58Z,user_data=None,user_id='19182660d2484754b9a921f3caf09b6b',uuid=dd3b9204-cbe1-458d-b7eb-44c824804e0c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cc431f94-2fd5-406f-bccb-182c23440f47", "address": "fa:16:3e:83:04:8f", "network": {"id": "b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-188557792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32d36f4d4d354e18b9270b2f2f540379", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc431f94-2f", "ovs_interfaceid": "cc431f94-2fd5-406f-bccb-182c23440f47", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 05 06:32:01 compute-0 nova_compute[186329]: 2025-12-05 06:32:01.666 186333 DEBUG nova.network.os_vif_util [None req-33ee5a57-0851-4688-aecb-f9fda857c30d 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Converting VIF {"id": "cc431f94-2fd5-406f-bccb-182c23440f47", "address": "fa:16:3e:83:04:8f", "network": {"id": "b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-188557792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32d36f4d4d354e18b9270b2f2f540379", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc431f94-2f", "ovs_interfaceid": "cc431f94-2fd5-406f-bccb-182c23440f47", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:32:01 compute-0 nova_compute[186329]: 2025-12-05 06:32:01.666 186333 DEBUG nova.network.os_vif_util [None req-33ee5a57-0851-4688-aecb-f9fda857c30d 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:83:04:8f,bridge_name='br-int',has_traffic_filtering=True,id=cc431f94-2fd5-406f-bccb-182c23440f47,network=Network(b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc431f94-2f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:32:01 compute-0 nova_compute[186329]: 2025-12-05 06:32:01.667 186333 DEBUG os_vif [None req-33ee5a57-0851-4688-aecb-f9fda857c30d 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:83:04:8f,bridge_name='br-int',has_traffic_filtering=True,id=cc431f94-2fd5-406f-bccb-182c23440f47,network=Network(b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc431f94-2f') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 05 06:32:01 compute-0 nova_compute[186329]: 2025-12-05 06:32:01.668 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:01 compute-0 nova_compute[186329]: 2025-12-05 06:32:01.668 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcc431f94-2f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:32:01 compute-0 nova_compute[186329]: 2025-12-05 06:32:01.670 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:01 compute-0 nova_compute[186329]: 2025-12-05 06:32:01.672 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:01 compute-0 nova_compute[186329]: 2025-12-05 06:32:01.672 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=2182cb92-1374-49a0-9464-3cdccc84c9eb) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:32:01 compute-0 nova_compute[186329]: 2025-12-05 06:32:01.673 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:01 compute-0 nova_compute[186329]: 2025-12-05 06:32:01.673 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:01 compute-0 nova_compute[186329]: 2025-12-05 06:32:01.675 186333 INFO os_vif [None req-33ee5a57-0851-4688-aecb-f9fda857c30d 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:83:04:8f,bridge_name='br-int',has_traffic_filtering=True,id=cc431f94-2fd5-406f-bccb-182c23440f47,network=Network(b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc431f94-2f')
Dec 05 06:32:01 compute-0 nova_compute[186329]: 2025-12-05 06:32:01.675 186333 INFO nova.virt.libvirt.driver [None req-33ee5a57-0851-4688-aecb-f9fda857c30d 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] [instance: dd3b9204-cbe1-458d-b7eb-44c824804e0c] Deleting instance files /var/lib/nova/instances/dd3b9204-cbe1-458d-b7eb-44c824804e0c_del
Dec 05 06:32:01 compute-0 nova_compute[186329]: 2025-12-05 06:32:01.676 186333 INFO nova.virt.libvirt.driver [None req-33ee5a57-0851-4688-aecb-f9fda857c30d 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] [instance: dd3b9204-cbe1-458d-b7eb-44c824804e0c] Deletion of /var/lib/nova/instances/dd3b9204-cbe1-458d-b7eb-44c824804e0c_del complete
Dec 05 06:32:02 compute-0 nova_compute[186329]: 2025-12-05 06:32:02.182 186333 INFO nova.compute.manager [None req-33ee5a57-0851-4688-aecb-f9fda857c30d 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] [instance: dd3b9204-cbe1-458d-b7eb-44c824804e0c] Took 1.26 seconds to destroy the instance on the hypervisor.
Dec 05 06:32:02 compute-0 nova_compute[186329]: 2025-12-05 06:32:02.183 186333 DEBUG oslo.service.backend._eventlet.loopingcall [None req-33ee5a57-0851-4688-aecb-f9fda857c30d 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 05 06:32:02 compute-0 nova_compute[186329]: 2025-12-05 06:32:02.183 186333 DEBUG nova.compute.manager [-] [instance: dd3b9204-cbe1-458d-b7eb-44c824804e0c] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 05 06:32:02 compute-0 nova_compute[186329]: 2025-12-05 06:32:02.183 186333 DEBUG nova.network.neutron [-] [instance: dd3b9204-cbe1-458d-b7eb-44c824804e0c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 05 06:32:02 compute-0 nova_compute[186329]: 2025-12-05 06:32:02.183 186333 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:32:02 compute-0 nova_compute[186329]: 2025-12-05 06:32:02.246 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbcb12e6-ebf7-49e2-847a-65f1b3a3266c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:32:02 compute-0 nova_compute[186329]: 2025-12-05 06:32:02.288 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbcb12e6-ebf7-49e2-847a-65f1b3a3266c/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:32:02 compute-0 nova_compute[186329]: 2025-12-05 06:32:02.289 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbcb12e6-ebf7-49e2-847a-65f1b3a3266c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:32:02 compute-0 nova_compute[186329]: 2025-12-05 06:32:02.335 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbcb12e6-ebf7-49e2-847a-65f1b3a3266c/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:32:02 compute-0 nova_compute[186329]: 2025-12-05 06:32:02.339 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8378e152-c0bc-4b50-889d-5ed87ecd729c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:32:02 compute-0 nova_compute[186329]: 2025-12-05 06:32:02.389 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8378e152-c0bc-4b50-889d-5ed87ecd729c/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:32:02 compute-0 nova_compute[186329]: 2025-12-05 06:32:02.390 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8378e152-c0bc-4b50-889d-5ed87ecd729c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:32:02 compute-0 nova_compute[186329]: 2025-12-05 06:32:02.429 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8378e152-c0bc-4b50-889d-5ed87ecd729c/disk --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:32:02 compute-0 nova_compute[186329]: 2025-12-05 06:32:02.433 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:32:02 compute-0 nova_compute[186329]: 2025-12-05 06:32:02.473 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:32:02 compute-0 nova_compute[186329]: 2025-12-05 06:32:02.474 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:32:02 compute-0 nova_compute[186329]: 2025-12-05 06:32:02.513 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk --force-share --output=json" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:32:02 compute-0 nova_compute[186329]: 2025-12-05 06:32:02.514 186333 WARNING nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Error from libvirt while getting description of instance-00000017: [Error Code 42] Domain not found: no domain with matching uuid 'dd3b9204-cbe1-458d-b7eb-44c824804e0c' (instance-00000017): libvirt.libvirtError: Domain not found: no domain with matching uuid 'dd3b9204-cbe1-458d-b7eb-44c824804e0c' (instance-00000017)
Dec 05 06:32:02 compute-0 nova_compute[186329]: 2025-12-05 06:32:02.637 186333 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:32:02 compute-0 nova_compute[186329]: 2025-12-05 06:32:02.726 186333 WARNING nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:32:02 compute-0 nova_compute[186329]: 2025-12-05 06:32:02.727 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:32:02 compute-0 nova_compute[186329]: 2025-12-05 06:32:02.740 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.014s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:32:02 compute-0 nova_compute[186329]: 2025-12-05 06:32:02.741 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5317MB free_disk=73.05031204223633GB free_vcpus=1 pci_devices=[{"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 05 06:32:02 compute-0 nova_compute[186329]: 2025-12-05 06:32:02.741 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:32:02 compute-0 nova_compute[186329]: 2025-12-05 06:32:02.741 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:32:02 compute-0 nova_compute[186329]: 2025-12-05 06:32:02.880 186333 DEBUG nova.compute.manager [req-b21abed7-b38a-4410-90ed-8bb33d2602f9 req-48276656-d554-45dc-86f3-ae43f86e2b19 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dd3b9204-cbe1-458d-b7eb-44c824804e0c] Received event network-vif-deleted-cc431f94-2fd5-406f-bccb-182c23440f47 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:32:02 compute-0 nova_compute[186329]: 2025-12-05 06:32:02.880 186333 INFO nova.compute.manager [req-b21abed7-b38a-4410-90ed-8bb33d2602f9 req-48276656-d554-45dc-86f3-ae43f86e2b19 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dd3b9204-cbe1-458d-b7eb-44c824804e0c] Neutron deleted interface cc431f94-2fd5-406f-bccb-182c23440f47; detaching it from the instance and deleting it from the info cache
Dec 05 06:32:02 compute-0 nova_compute[186329]: 2025-12-05 06:32:02.881 186333 DEBUG nova.network.neutron [req-b21abed7-b38a-4410-90ed-8bb33d2602f9 req-48276656-d554-45dc-86f3-ae43f86e2b19 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dd3b9204-cbe1-458d-b7eb-44c824804e0c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:32:03 compute-0 nova_compute[186329]: 2025-12-05 06:32:03.129 186333 DEBUG nova.compute.manager [req-166afec3-e7d4-408f-905a-3dd8dac03698 req-6ce9e720-de55-4cbb-9b23-e802638a84a0 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dd3b9204-cbe1-458d-b7eb-44c824804e0c] Received event network-vif-unplugged-cc431f94-2fd5-406f-bccb-182c23440f47 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:32:03 compute-0 nova_compute[186329]: 2025-12-05 06:32:03.129 186333 DEBUG oslo_concurrency.lockutils [req-166afec3-e7d4-408f-905a-3dd8dac03698 req-6ce9e720-de55-4cbb-9b23-e802638a84a0 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "dd3b9204-cbe1-458d-b7eb-44c824804e0c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:32:03 compute-0 nova_compute[186329]: 2025-12-05 06:32:03.129 186333 DEBUG oslo_concurrency.lockutils [req-166afec3-e7d4-408f-905a-3dd8dac03698 req-6ce9e720-de55-4cbb-9b23-e802638a84a0 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "dd3b9204-cbe1-458d-b7eb-44c824804e0c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:32:03 compute-0 nova_compute[186329]: 2025-12-05 06:32:03.130 186333 DEBUG oslo_concurrency.lockutils [req-166afec3-e7d4-408f-905a-3dd8dac03698 req-6ce9e720-de55-4cbb-9b23-e802638a84a0 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "dd3b9204-cbe1-458d-b7eb-44c824804e0c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:32:03 compute-0 nova_compute[186329]: 2025-12-05 06:32:03.130 186333 DEBUG nova.compute.manager [req-166afec3-e7d4-408f-905a-3dd8dac03698 req-6ce9e720-de55-4cbb-9b23-e802638a84a0 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dd3b9204-cbe1-458d-b7eb-44c824804e0c] No waiting events found dispatching network-vif-unplugged-cc431f94-2fd5-406f-bccb-182c23440f47 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:32:03 compute-0 nova_compute[186329]: 2025-12-05 06:32:03.130 186333 DEBUG nova.compute.manager [req-166afec3-e7d4-408f-905a-3dd8dac03698 req-6ce9e720-de55-4cbb-9b23-e802638a84a0 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dd3b9204-cbe1-458d-b7eb-44c824804e0c] Received event network-vif-unplugged-cc431f94-2fd5-406f-bccb-182c23440f47 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 05 06:32:03 compute-0 nova_compute[186329]: 2025-12-05 06:32:03.335 186333 DEBUG nova.network.neutron [-] [instance: dd3b9204-cbe1-458d-b7eb-44c824804e0c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:32:03 compute-0 nova_compute[186329]: 2025-12-05 06:32:03.385 186333 DEBUG nova.compute.manager [req-b21abed7-b38a-4410-90ed-8bb33d2602f9 req-48276656-d554-45dc-86f3-ae43f86e2b19 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: dd3b9204-cbe1-458d-b7eb-44c824804e0c] Detach interface failed, port_id=cc431f94-2fd5-406f-bccb-182c23440f47, reason: Instance dd3b9204-cbe1-458d-b7eb-44c824804e0c could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11650
Dec 05 06:32:03 compute-0 nova_compute[186329]: 2025-12-05 06:32:03.752 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Applying migration context for instance dd3b9204-cbe1-458d-b7eb-44c824804e0c as it has an incoming, in-progress migration 56618304-30eb-4265-93dd-a1a308603677. Migration status is running _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1046
Dec 05 06:32:03 compute-0 nova_compute[186329]: 2025-12-05 06:32:03.752 186333 DEBUG nova.objects.instance [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] [instance: dd3b9204-cbe1-458d-b7eb-44c824804e0c] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Dec 05 06:32:03 compute-0 nova_compute[186329]: 2025-12-05 06:32:03.838 186333 INFO nova.compute.manager [-] [instance: dd3b9204-cbe1-458d-b7eb-44c824804e0c] Took 1.65 seconds to deallocate network for instance.
Dec 05 06:32:04 compute-0 nova_compute[186329]: 2025-12-05 06:32:04.026 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:04 compute-0 nova_compute[186329]: 2025-12-05 06:32:04.351 186333 DEBUG oslo_concurrency.lockutils [None req-33ee5a57-0851-4688-aecb-f9fda857c30d 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:32:04 compute-0 nova_compute[186329]: 2025-12-05 06:32:04.762 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] [instance: dd3b9204-cbe1-458d-b7eb-44c824804e0c] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Dec 05 06:32:05 compute-0 nova_compute[186329]: 2025-12-05 06:32:05.286 186333 INFO nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance 83dcc9e9-9fcf-4e34-8b76-598c38ed9bc0 has allocations against this compute host but is not found in the database.
Dec 05 06:32:05 compute-0 nova_compute[186329]: 2025-12-05 06:32:05.286 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance dd3b9204-cbe1-458d-b7eb-44c824804e0c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 05 06:32:05 compute-0 nova_compute[186329]: 2025-12-05 06:32:05.287 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance 8378e152-c0bc-4b50-889d-5ed87ecd729c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 05 06:32:05 compute-0 nova_compute[186329]: 2025-12-05 06:32:05.287 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 05 06:32:05 compute-0 nova_compute[186329]: 2025-12-05 06:32:05.287 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=4 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 06:32:02 up  1:10,  0 user,  load average: 0.14, 0.11, 0.19\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_None': '1', 'num_os_type_None': '2', 'num_proj_3c9c02fc0c8641c48e6ccfd619fde68d': '2', 'io_workload': '0', 'num_task_deleting': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 05 06:32:05 compute-0 nova_compute[186329]: 2025-12-05 06:32:05.338 186333 DEBUG nova.compute.provider_tree [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:32:05 compute-0 nova_compute[186329]: 2025-12-05 06:32:05.844 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:32:06 compute-0 nova_compute[186329]: 2025-12-05 06:32:06.351 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 05 06:32:06 compute-0 nova_compute[186329]: 2025-12-05 06:32:06.351 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.610s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:32:06 compute-0 nova_compute[186329]: 2025-12-05 06:32:06.351 186333 DEBUG oslo_concurrency.lockutils [None req-33ee5a57-0851-4688-aecb-f9fda857c30d 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 2.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:32:06 compute-0 nova_compute[186329]: 2025-12-05 06:32:06.394 186333 DEBUG nova.compute.provider_tree [None req-33ee5a57-0851-4688-aecb-f9fda857c30d 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:32:06 compute-0 nova_compute[186329]: 2025-12-05 06:32:06.674 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:06 compute-0 nova_compute[186329]: 2025-12-05 06:32:06.899 186333 DEBUG nova.scheduler.client.report [None req-33ee5a57-0851-4688-aecb-f9fda857c30d 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:32:07 compute-0 nova_compute[186329]: 2025-12-05 06:32:07.405 186333 DEBUG oslo_concurrency.lockutils [None req-33ee5a57-0851-4688-aecb-f9fda857c30d 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.053s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:32:07 compute-0 nova_compute[186329]: 2025-12-05 06:32:07.420 186333 INFO nova.scheduler.client.report [None req-33ee5a57-0851-4688-aecb-f9fda857c30d 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Deleted allocations for instance dd3b9204-cbe1-458d-b7eb-44c824804e0c
Dec 05 06:32:08 compute-0 nova_compute[186329]: 2025-12-05 06:32:08.445 186333 DEBUG oslo_concurrency.lockutils [None req-33ee5a57-0851-4688-aecb-f9fda857c30d 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Lock "dd3b9204-cbe1-458d-b7eb-44c824804e0c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.041s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:32:09 compute-0 nova_compute[186329]: 2025-12-05 06:32:09.027 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:09 compute-0 nova_compute[186329]: 2025-12-05 06:32:09.352 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:32:09 compute-0 nova_compute[186329]: 2025-12-05 06:32:09.352 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:32:09 compute-0 nova_compute[186329]: 2025-12-05 06:32:09.352 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:32:09 compute-0 nova_compute[186329]: 2025-12-05 06:32:09.352 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:32:10 compute-0 nova_compute[186329]: 2025-12-05 06:32:10.314 186333 DEBUG oslo_concurrency.lockutils [None req-30ce0d2b-c33d-4ce2-9d00-ac2a316363cd 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Acquiring lock "8378e152-c0bc-4b50-889d-5ed87ecd729c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:32:10 compute-0 nova_compute[186329]: 2025-12-05 06:32:10.314 186333 DEBUG oslo_concurrency.lockutils [None req-30ce0d2b-c33d-4ce2-9d00-ac2a316363cd 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Lock "8378e152-c0bc-4b50-889d-5ed87ecd729c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:32:10 compute-0 nova_compute[186329]: 2025-12-05 06:32:10.315 186333 DEBUG oslo_concurrency.lockutils [None req-30ce0d2b-c33d-4ce2-9d00-ac2a316363cd 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Acquiring lock "8378e152-c0bc-4b50-889d-5ed87ecd729c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:32:10 compute-0 nova_compute[186329]: 2025-12-05 06:32:10.315 186333 DEBUG oslo_concurrency.lockutils [None req-30ce0d2b-c33d-4ce2-9d00-ac2a316363cd 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Lock "8378e152-c0bc-4b50-889d-5ed87ecd729c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:32:10 compute-0 nova_compute[186329]: 2025-12-05 06:32:10.315 186333 DEBUG oslo_concurrency.lockutils [None req-30ce0d2b-c33d-4ce2-9d00-ac2a316363cd 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Lock "8378e152-c0bc-4b50-889d-5ed87ecd729c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:32:10 compute-0 nova_compute[186329]: 2025-12-05 06:32:10.322 186333 INFO nova.compute.manager [None req-30ce0d2b-c33d-4ce2-9d00-ac2a316363cd 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] [instance: 8378e152-c0bc-4b50-889d-5ed87ecd729c] Terminating instance
Dec 05 06:32:10 compute-0 podman[214559]: 2025-12-05 06:32:10.458543995 +0000 UTC m=+0.042414916 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 06:32:10 compute-0 podman[214558]: 2025-12-05 06:32:10.476730709 +0000 UTC m=+0.062342623 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS)
Dec 05 06:32:10 compute-0 nova_compute[186329]: 2025-12-05 06:32:10.829 186333 DEBUG nova.compute.manager [None req-30ce0d2b-c33d-4ce2-9d00-ac2a316363cd 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] [instance: 8378e152-c0bc-4b50-889d-5ed87ecd729c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 05 06:32:10 compute-0 kernel: tap9eb9012b-dc (unregistering): left promiscuous mode
Dec 05 06:32:10 compute-0 NetworkManager[55434]: <info>  [1764916330.8613] device (tap9eb9012b-dc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 06:32:10 compute-0 ovn_controller[95223]: 2025-12-05T06:32:10Z|00202|binding|INFO|Releasing lport 9eb9012b-dce5-48ed-afc1-5fccc0654e2e from this chassis (sb_readonly=0)
Dec 05 06:32:10 compute-0 ovn_controller[95223]: 2025-12-05T06:32:10Z|00203|binding|INFO|Setting lport 9eb9012b-dce5-48ed-afc1-5fccc0654e2e down in Southbound
Dec 05 06:32:10 compute-0 ovn_controller[95223]: 2025-12-05T06:32:10Z|00204|binding|INFO|Removing iface tap9eb9012b-dc ovn-installed in OVS
Dec 05 06:32:10 compute-0 nova_compute[186329]: 2025-12-05 06:32:10.866 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:10 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:32:10.869 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5a:41:f7 10.100.0.8'], port_security=['fa:16:3e:5a:41:f7 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '8378e152-c0bc-4b50-889d-5ed87ecd729c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3c9c02fc0c8641c48e6ccfd619fde68d', 'neutron:revision_number': '15', 'neutron:security_group_ids': 'bcbaedeb-3d7e-4385-ba5a-8212e2f97a4c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fdc9478a-32e1-4f82-931f-bf423e0028ed, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], logical_port=9eb9012b-dce5-48ed-afc1-5fccc0654e2e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:32:10 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:32:10.869 104041 INFO neutron.agent.ovn.metadata.agent [-] Port 9eb9012b-dce5-48ed-afc1-5fccc0654e2e in datapath b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4 unbound from our chassis
Dec 05 06:32:10 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:32:10.870 104041 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 05 06:32:10 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:32:10.871 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[a4bb9503-b81b-404d-92c5-f893836aabdf]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:32:10 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:32:10.872 104041 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4 namespace which is not needed anymore
Dec 05 06:32:10 compute-0 nova_compute[186329]: 2025-12-05 06:32:10.886 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:10 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000016.scope: Deactivated successfully.
Dec 05 06:32:10 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000016.scope: Consumed 2.310s CPU time.
Dec 05 06:32:10 compute-0 systemd-machined[152967]: Machine qemu-17-instance-00000016 terminated.
Dec 05 06:32:10 compute-0 neutron-haproxy-ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4[214323]: [NOTICE]   (214327) : haproxy version is 3.0.5-8e879a5
Dec 05 06:32:10 compute-0 neutron-haproxy-ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4[214323]: [NOTICE]   (214327) : path to executable is /usr/sbin/haproxy
Dec 05 06:32:10 compute-0 neutron-haproxy-ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4[214323]: [WARNING]  (214327) : Exiting Master process...
Dec 05 06:32:10 compute-0 podman[214625]: 2025-12-05 06:32:10.968928304 +0000 UTC m=+0.021664944 container kill 5ddffcff8af582fcd5ad25c04a5afeeae249aa081400612a8c26827e96850a86 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 05 06:32:10 compute-0 neutron-haproxy-ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4[214323]: [ALERT]    (214327) : Current worker (214329) exited with code 143 (Terminated)
Dec 05 06:32:10 compute-0 neutron-haproxy-ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4[214323]: [WARNING]  (214327) : All workers exited. Exiting... (0)
Dec 05 06:32:10 compute-0 systemd[1]: libpod-5ddffcff8af582fcd5ad25c04a5afeeae249aa081400612a8c26827e96850a86.scope: Deactivated successfully.
Dec 05 06:32:11 compute-0 podman[214637]: 2025-12-05 06:32:11.001940791 +0000 UTC m=+0.016860180 container died 5ddffcff8af582fcd5ad25c04a5afeeae249aa081400612a8c26827e96850a86 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Dec 05 06:32:11 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5ddffcff8af582fcd5ad25c04a5afeeae249aa081400612a8c26827e96850a86-userdata-shm.mount: Deactivated successfully.
Dec 05 06:32:11 compute-0 systemd[1]: var-lib-containers-storage-overlay-8fa3d194a6e419654ab1c71c709f81fbf4c54cdde58d649c4d9b2b43e2dc652e-merged.mount: Deactivated successfully.
Dec 05 06:32:11 compute-0 podman[214637]: 2025-12-05 06:32:11.022972894 +0000 UTC m=+0.037892282 container cleanup 5ddffcff8af582fcd5ad25c04a5afeeae249aa081400612a8c26827e96850a86 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 05 06:32:11 compute-0 systemd[1]: libpod-conmon-5ddffcff8af582fcd5ad25c04a5afeeae249aa081400612a8c26827e96850a86.scope: Deactivated successfully.
Dec 05 06:32:11 compute-0 podman[214639]: 2025-12-05 06:32:11.030572121 +0000 UTC m=+0.040194391 container remove 5ddffcff8af582fcd5ad25c04a5afeeae249aa081400612a8c26827e96850a86 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.4)
Dec 05 06:32:11 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:32:11.035 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[33719290-2388-4c9a-9735-ca8fb1f5b638]: (4, ("Fri Dec  5 06:32:10 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4 (5ddffcff8af582fcd5ad25c04a5afeeae249aa081400612a8c26827e96850a86)\n5ddffcff8af582fcd5ad25c04a5afeeae249aa081400612a8c26827e96850a86\nFri Dec  5 06:32:10 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4 (5ddffcff8af582fcd5ad25c04a5afeeae249aa081400612a8c26827e96850a86)\n5ddffcff8af582fcd5ad25c04a5afeeae249aa081400612a8c26827e96850a86\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:32:11 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:32:11.036 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[47818e28-f136-44e7-ae8c-0f321b6db5eb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:32:11 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:32:11.037 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:32:11 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:32:11.037 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[b6a17c21-6ab6-438b-9b6f-b5a6950cadb7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:32:11 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:32:11.039 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb1b8634d-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:32:11 compute-0 nova_compute[186329]: 2025-12-05 06:32:11.044 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:11 compute-0 kernel: tapb1b8634d-60: left promiscuous mode
Dec 05 06:32:11 compute-0 nova_compute[186329]: 2025-12-05 06:32:11.054 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:11 compute-0 nova_compute[186329]: 2025-12-05 06:32:11.057 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:11 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:32:11.058 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[48ad36cf-1e02-4c1e-8673-80b3e5650e6b]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:32:11 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:32:11.066 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[62f2e3d2-5cbf-476b-b658-d8dc688cb10d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:32:11 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:32:11.068 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[fdcbbded-df55-4442-86af-35f15fcada16]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:32:11 compute-0 nova_compute[186329]: 2025-12-05 06:32:11.075 186333 INFO nova.virt.libvirt.driver [-] [instance: 8378e152-c0bc-4b50-889d-5ed87ecd729c] Instance destroyed successfully.
Dec 05 06:32:11 compute-0 nova_compute[186329]: 2025-12-05 06:32:11.075 186333 DEBUG nova.objects.instance [None req-30ce0d2b-c33d-4ce2-9d00-ac2a316363cd 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Lazy-loading 'resources' on Instance uuid 8378e152-c0bc-4b50-889d-5ed87ecd729c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:32:11 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:32:11.081 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[083bfe07-8bc6-4e6e-a6ee-139f0a4fa19f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415756, 'reachable_time': 34500, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214678, 'error': None, 'target': 'ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:32:11 compute-0 systemd[1]: run-netns-ovnmeta\x2db1b8634d\x2d6c3c\x2d4ad1\x2dbde2\x2d3aa71bdb92e4.mount: Deactivated successfully.
Dec 05 06:32:11 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:32:11.085 104153 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 05 06:32:11 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:32:11.085 104153 DEBUG oslo.privsep.daemon [-] privsep: reply[849fdd49-841c-43fb-b0a3-b54275e3a61e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:32:11 compute-0 nova_compute[186329]: 2025-12-05 06:32:11.579 186333 DEBUG nova.virt.libvirt.vif [None req-30ce0d2b-c33d-4ce2-9d00-ac2a316363cd 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-12-05T06:30:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-1236199558',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-1236199',id=22,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T06:30:25Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3c9c02fc0c8641c48e6ccfd619fde68d',ramdisk_id='',reservation_id='r-1zmw8e93',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,member,admin,reader',clean_attempts='1',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',im
age_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-217954151',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-217954151-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T06:31:30Z,user_data=None,user_id='19182660d2484754b9a921f3caf09b6b',uuid=8378e152-c0bc-4b50-889d-5ed87ecd729c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9eb9012b-dce5-48ed-afc1-5fccc0654e2e", "address": "fa:16:3e:5a:41:f7", "network": {"id": "b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-188557792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32d36f4d4d354e18b9270b2f2f540379", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9eb9012b-dc", "ovs_interfaceid": "9eb9012b-dce5-48ed-afc1-5fccc0654e2e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 05 06:32:11 compute-0 nova_compute[186329]: 2025-12-05 06:32:11.580 186333 DEBUG nova.network.os_vif_util [None req-30ce0d2b-c33d-4ce2-9d00-ac2a316363cd 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Converting VIF {"id": "9eb9012b-dce5-48ed-afc1-5fccc0654e2e", "address": "fa:16:3e:5a:41:f7", "network": {"id": "b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-188557792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32d36f4d4d354e18b9270b2f2f540379", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9eb9012b-dc", "ovs_interfaceid": "9eb9012b-dce5-48ed-afc1-5fccc0654e2e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:32:11 compute-0 nova_compute[186329]: 2025-12-05 06:32:11.580 186333 DEBUG nova.network.os_vif_util [None req-30ce0d2b-c33d-4ce2-9d00-ac2a316363cd 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5a:41:f7,bridge_name='br-int',has_traffic_filtering=True,id=9eb9012b-dce5-48ed-afc1-5fccc0654e2e,network=Network(b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9eb9012b-dc') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:32:11 compute-0 nova_compute[186329]: 2025-12-05 06:32:11.580 186333 DEBUG os_vif [None req-30ce0d2b-c33d-4ce2-9d00-ac2a316363cd 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5a:41:f7,bridge_name='br-int',has_traffic_filtering=True,id=9eb9012b-dce5-48ed-afc1-5fccc0654e2e,network=Network(b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9eb9012b-dc') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 05 06:32:11 compute-0 nova_compute[186329]: 2025-12-05 06:32:11.582 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:11 compute-0 nova_compute[186329]: 2025-12-05 06:32:11.582 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9eb9012b-dc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:32:11 compute-0 nova_compute[186329]: 2025-12-05 06:32:11.583 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:11 compute-0 nova_compute[186329]: 2025-12-05 06:32:11.585 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:32:11 compute-0 nova_compute[186329]: 2025-12-05 06:32:11.585 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:11 compute-0 nova_compute[186329]: 2025-12-05 06:32:11.585 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:11 compute-0 nova_compute[186329]: 2025-12-05 06:32:11.586 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=4b062bbb-e1ae-4267-bc32-44a6731dae2f) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:32:11 compute-0 nova_compute[186329]: 2025-12-05 06:32:11.586 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:11 compute-0 nova_compute[186329]: 2025-12-05 06:32:11.587 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:11 compute-0 nova_compute[186329]: 2025-12-05 06:32:11.588 186333 INFO os_vif [None req-30ce0d2b-c33d-4ce2-9d00-ac2a316363cd 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5a:41:f7,bridge_name='br-int',has_traffic_filtering=True,id=9eb9012b-dce5-48ed-afc1-5fccc0654e2e,network=Network(b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9eb9012b-dc')
Dec 05 06:32:11 compute-0 nova_compute[186329]: 2025-12-05 06:32:11.589 186333 INFO nova.virt.libvirt.driver [None req-30ce0d2b-c33d-4ce2-9d00-ac2a316363cd 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] [instance: 8378e152-c0bc-4b50-889d-5ed87ecd729c] Deleting instance files /var/lib/nova/instances/8378e152-c0bc-4b50-889d-5ed87ecd729c_del
Dec 05 06:32:11 compute-0 nova_compute[186329]: 2025-12-05 06:32:11.589 186333 INFO nova.virt.libvirt.driver [None req-30ce0d2b-c33d-4ce2-9d00-ac2a316363cd 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] [instance: 8378e152-c0bc-4b50-889d-5ed87ecd729c] Deletion of /var/lib/nova/instances/8378e152-c0bc-4b50-889d-5ed87ecd729c_del complete
Dec 05 06:32:11 compute-0 nova_compute[186329]: 2025-12-05 06:32:11.674 186333 DEBUG nova.compute.manager [req-fcde2355-6abb-42a4-adde-c0aa0b1d6b05 req-f30fdbcd-23e6-46ff-bf3f-4da13c34790a fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8378e152-c0bc-4b50-889d-5ed87ecd729c] Received event network-vif-unplugged-9eb9012b-dce5-48ed-afc1-5fccc0654e2e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:32:11 compute-0 nova_compute[186329]: 2025-12-05 06:32:11.675 186333 DEBUG oslo_concurrency.lockutils [req-fcde2355-6abb-42a4-adde-c0aa0b1d6b05 req-f30fdbcd-23e6-46ff-bf3f-4da13c34790a fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "8378e152-c0bc-4b50-889d-5ed87ecd729c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:32:11 compute-0 nova_compute[186329]: 2025-12-05 06:32:11.675 186333 DEBUG oslo_concurrency.lockutils [req-fcde2355-6abb-42a4-adde-c0aa0b1d6b05 req-f30fdbcd-23e6-46ff-bf3f-4da13c34790a fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "8378e152-c0bc-4b50-889d-5ed87ecd729c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:32:11 compute-0 nova_compute[186329]: 2025-12-05 06:32:11.675 186333 DEBUG oslo_concurrency.lockutils [req-fcde2355-6abb-42a4-adde-c0aa0b1d6b05 req-f30fdbcd-23e6-46ff-bf3f-4da13c34790a fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "8378e152-c0bc-4b50-889d-5ed87ecd729c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:32:11 compute-0 nova_compute[186329]: 2025-12-05 06:32:11.675 186333 DEBUG nova.compute.manager [req-fcde2355-6abb-42a4-adde-c0aa0b1d6b05 req-f30fdbcd-23e6-46ff-bf3f-4da13c34790a fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8378e152-c0bc-4b50-889d-5ed87ecd729c] No waiting events found dispatching network-vif-unplugged-9eb9012b-dce5-48ed-afc1-5fccc0654e2e pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:32:11 compute-0 nova_compute[186329]: 2025-12-05 06:32:11.675 186333 DEBUG nova.compute.manager [req-fcde2355-6abb-42a4-adde-c0aa0b1d6b05 req-f30fdbcd-23e6-46ff-bf3f-4da13c34790a fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8378e152-c0bc-4b50-889d-5ed87ecd729c] Received event network-vif-unplugged-9eb9012b-dce5-48ed-afc1-5fccc0654e2e for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 05 06:32:12 compute-0 nova_compute[186329]: 2025-12-05 06:32:12.096 186333 INFO nova.compute.manager [None req-30ce0d2b-c33d-4ce2-9d00-ac2a316363cd 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] [instance: 8378e152-c0bc-4b50-889d-5ed87ecd729c] Took 1.27 seconds to destroy the instance on the hypervisor.
Dec 05 06:32:12 compute-0 nova_compute[186329]: 2025-12-05 06:32:12.097 186333 DEBUG oslo.service.backend._eventlet.loopingcall [None req-30ce0d2b-c33d-4ce2-9d00-ac2a316363cd 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 05 06:32:12 compute-0 nova_compute[186329]: 2025-12-05 06:32:12.097 186333 DEBUG nova.compute.manager [-] [instance: 8378e152-c0bc-4b50-889d-5ed87ecd729c] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 05 06:32:12 compute-0 nova_compute[186329]: 2025-12-05 06:32:12.097 186333 DEBUG nova.network.neutron [-] [instance: 8378e152-c0bc-4b50-889d-5ed87ecd729c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 05 06:32:12 compute-0 nova_compute[186329]: 2025-12-05 06:32:12.097 186333 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:32:12 compute-0 nova_compute[186329]: 2025-12-05 06:32:12.636 186333 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:32:13 compute-0 nova_compute[186329]: 2025-12-05 06:32:13.371 186333 DEBUG nova.network.neutron [-] [instance: 8378e152-c0bc-4b50-889d-5ed87ecd729c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:32:13 compute-0 nova_compute[186329]: 2025-12-05 06:32:13.716 186333 DEBUG nova.compute.manager [req-1cb4489b-3933-4959-9e8d-b0616d14c24b req-1012497b-0d8c-4a84-84ef-5ca71ec685a7 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8378e152-c0bc-4b50-889d-5ed87ecd729c] Received event network-vif-unplugged-9eb9012b-dce5-48ed-afc1-5fccc0654e2e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:32:13 compute-0 nova_compute[186329]: 2025-12-05 06:32:13.716 186333 DEBUG oslo_concurrency.lockutils [req-1cb4489b-3933-4959-9e8d-b0616d14c24b req-1012497b-0d8c-4a84-84ef-5ca71ec685a7 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "8378e152-c0bc-4b50-889d-5ed87ecd729c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:32:13 compute-0 nova_compute[186329]: 2025-12-05 06:32:13.717 186333 DEBUG oslo_concurrency.lockutils [req-1cb4489b-3933-4959-9e8d-b0616d14c24b req-1012497b-0d8c-4a84-84ef-5ca71ec685a7 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "8378e152-c0bc-4b50-889d-5ed87ecd729c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:32:13 compute-0 nova_compute[186329]: 2025-12-05 06:32:13.717 186333 DEBUG oslo_concurrency.lockutils [req-1cb4489b-3933-4959-9e8d-b0616d14c24b req-1012497b-0d8c-4a84-84ef-5ca71ec685a7 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "8378e152-c0bc-4b50-889d-5ed87ecd729c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:32:13 compute-0 nova_compute[186329]: 2025-12-05 06:32:13.717 186333 DEBUG nova.compute.manager [req-1cb4489b-3933-4959-9e8d-b0616d14c24b req-1012497b-0d8c-4a84-84ef-5ca71ec685a7 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8378e152-c0bc-4b50-889d-5ed87ecd729c] No waiting events found dispatching network-vif-unplugged-9eb9012b-dce5-48ed-afc1-5fccc0654e2e pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:32:13 compute-0 nova_compute[186329]: 2025-12-05 06:32:13.717 186333 DEBUG nova.compute.manager [req-1cb4489b-3933-4959-9e8d-b0616d14c24b req-1012497b-0d8c-4a84-84ef-5ca71ec685a7 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8378e152-c0bc-4b50-889d-5ed87ecd729c] Received event network-vif-unplugged-9eb9012b-dce5-48ed-afc1-5fccc0654e2e for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 05 06:32:13 compute-0 nova_compute[186329]: 2025-12-05 06:32:13.717 186333 DEBUG nova.compute.manager [req-1cb4489b-3933-4959-9e8d-b0616d14c24b req-1012497b-0d8c-4a84-84ef-5ca71ec685a7 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 8378e152-c0bc-4b50-889d-5ed87ecd729c] Received event network-vif-deleted-9eb9012b-dce5-48ed-afc1-5fccc0654e2e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:32:13 compute-0 nova_compute[186329]: 2025-12-05 06:32:13.875 186333 INFO nova.compute.manager [-] [instance: 8378e152-c0bc-4b50-889d-5ed87ecd729c] Took 1.78 seconds to deallocate network for instance.
Dec 05 06:32:14 compute-0 nova_compute[186329]: 2025-12-05 06:32:14.028 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:14 compute-0 nova_compute[186329]: 2025-12-05 06:32:14.386 186333 DEBUG oslo_concurrency.lockutils [None req-30ce0d2b-c33d-4ce2-9d00-ac2a316363cd 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:32:14 compute-0 nova_compute[186329]: 2025-12-05 06:32:14.387 186333 DEBUG oslo_concurrency.lockutils [None req-30ce0d2b-c33d-4ce2-9d00-ac2a316363cd 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:32:14 compute-0 nova_compute[186329]: 2025-12-05 06:32:14.432 186333 DEBUG nova.compute.provider_tree [None req-30ce0d2b-c33d-4ce2-9d00-ac2a316363cd 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:32:14 compute-0 nova_compute[186329]: 2025-12-05 06:32:14.936 186333 DEBUG nova.scheduler.client.report [None req-30ce0d2b-c33d-4ce2-9d00-ac2a316363cd 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:32:15 compute-0 nova_compute[186329]: 2025-12-05 06:32:15.442 186333 DEBUG oslo_concurrency.lockutils [None req-30ce0d2b-c33d-4ce2-9d00-ac2a316363cd 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.055s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:32:15 compute-0 nova_compute[186329]: 2025-12-05 06:32:15.459 186333 INFO nova.scheduler.client.report [None req-30ce0d2b-c33d-4ce2-9d00-ac2a316363cd 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Deleted allocations for instance 8378e152-c0bc-4b50-889d-5ed87ecd729c
Dec 05 06:32:16 compute-0 nova_compute[186329]: 2025-12-05 06:32:16.476 186333 DEBUG oslo_concurrency.lockutils [None req-30ce0d2b-c33d-4ce2-9d00-ac2a316363cd 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Lock "8378e152-c0bc-4b50-889d-5ed87ecd729c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.162s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:32:16 compute-0 nova_compute[186329]: 2025-12-05 06:32:16.587 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:19 compute-0 nova_compute[186329]: 2025-12-05 06:32:19.029 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:20 compute-0 podman[214681]: 2025-12-05 06:32:20.466499211 +0000 UTC m=+0.051377047 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 05 06:32:20 compute-0 podman[214682]: 2025-12-05 06:32:20.469607905 +0000 UTC m=+0.053332413 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, io.openshift.tags=minimal rhel9, vcs-type=git, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public)
Dec 05 06:32:20 compute-0 podman[214683]: 2025-12-05 06:32:20.469631299 +0000 UTC m=+0.051447509 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec)
Dec 05 06:32:21 compute-0 nova_compute[186329]: 2025-12-05 06:32:21.587 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:24 compute-0 nova_compute[186329]: 2025-12-05 06:32:24.030 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:24 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:32:24.889 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '46:84:d1', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'a6:99:88:db:d9:a2'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:32:24 compute-0 nova_compute[186329]: 2025-12-05 06:32:24.890 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:24 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:32:24.890 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 05 06:32:26 compute-0 nova_compute[186329]: 2025-12-05 06:32:26.589 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:29 compute-0 nova_compute[186329]: 2025-12-05 06:32:29.031 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:32:29.519 104041 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:32:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:32:29.520 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:32:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:32:29.520 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:32:29 compute-0 podman[196599]: time="2025-12-05T06:32:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:32:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:32:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:32:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:32:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2588 "" "Go-http-client/1.1"
Dec 05 06:32:30 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:32:30.891 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=89d40815-76f5-4f1d-9077-84d831b7d6c4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:32:31 compute-0 openstack_network_exporter[198686]: ERROR   06:32:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:32:31 compute-0 openstack_network_exporter[198686]: ERROR   06:32:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:32:31 compute-0 openstack_network_exporter[198686]: ERROR   06:32:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:32:31 compute-0 openstack_network_exporter[198686]: ERROR   06:32:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:32:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:32:31 compute-0 openstack_network_exporter[198686]: ERROR   06:32:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:32:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:32:31 compute-0 nova_compute[186329]: 2025-12-05 06:32:31.590 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:34 compute-0 nova_compute[186329]: 2025-12-05 06:32:34.033 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:36 compute-0 nova_compute[186329]: 2025-12-05 06:32:36.591 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:39 compute-0 nova_compute[186329]: 2025-12-05 06:32:39.034 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:41 compute-0 podman[214737]: 2025-12-05 06:32:41.454430619 +0000 UTC m=+0.037278038 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 06:32:41 compute-0 podman[214736]: 2025-12-05 06:32:41.504482331 +0000 UTC m=+0.089628232 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 05 06:32:41 compute-0 nova_compute[186329]: 2025-12-05 06:32:41.593 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:44 compute-0 nova_compute[186329]: 2025-12-05 06:32:44.035 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:46 compute-0 nova_compute[186329]: 2025-12-05 06:32:46.595 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:49 compute-0 nova_compute[186329]: 2025-12-05 06:32:49.037 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:51 compute-0 podman[214780]: 2025-12-05 06:32:51.453146925 +0000 UTC m=+0.038972033 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 05 06:32:51 compute-0 podman[214782]: 2025-12-05 06:32:51.46343129 +0000 UTC m=+0.044284180 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 05 06:32:51 compute-0 podman[214781]: 2025-12-05 06:32:51.48010032 +0000 UTC m=+0.064493497 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 05 06:32:51 compute-0 nova_compute[186329]: 2025-12-05 06:32:51.596 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:54 compute-0 nova_compute[186329]: 2025-12-05 06:32:54.038 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:56 compute-0 nova_compute[186329]: 2025-12-05 06:32:56.598 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:59 compute-0 nova_compute[186329]: 2025-12-05 06:32:59.039 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:32:59 compute-0 nova_compute[186329]: 2025-12-05 06:32:59.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:32:59 compute-0 podman[196599]: time="2025-12-05T06:32:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:32:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:32:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:32:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:32:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2589 "" "Go-http-client/1.1"
Dec 05 06:33:00 compute-0 nova_compute[186329]: 2025-12-05 06:33:00.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:33:00 compute-0 nova_compute[186329]: 2025-12-05 06:33:00.710 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 05 06:33:01 compute-0 openstack_network_exporter[198686]: ERROR   06:33:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:33:01 compute-0 openstack_network_exporter[198686]: ERROR   06:33:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:33:01 compute-0 openstack_network_exporter[198686]: ERROR   06:33:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:33:01 compute-0 openstack_network_exporter[198686]: ERROR   06:33:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:33:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:33:01 compute-0 openstack_network_exporter[198686]: ERROR   06:33:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:33:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:33:01 compute-0 nova_compute[186329]: 2025-12-05 06:33:01.600 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:01 compute-0 nova_compute[186329]: 2025-12-05 06:33:01.705 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:33:01 compute-0 nova_compute[186329]: 2025-12-05 06:33:01.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:33:02 compute-0 nova_compute[186329]: 2025-12-05 06:33:02.220 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:33:02 compute-0 nova_compute[186329]: 2025-12-05 06:33:02.221 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:33:02 compute-0 nova_compute[186329]: 2025-12-05 06:33:02.221 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:33:02 compute-0 nova_compute[186329]: 2025-12-05 06:33:02.221 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 05 06:33:03 compute-0 nova_compute[186329]: 2025-12-05 06:33:03.246 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbcb12e6-ebf7-49e2-847a-65f1b3a3266c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:33:03 compute-0 nova_compute[186329]: 2025-12-05 06:33:03.287 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbcb12e6-ebf7-49e2-847a-65f1b3a3266c/disk --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:33:03 compute-0 nova_compute[186329]: 2025-12-05 06:33:03.288 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbcb12e6-ebf7-49e2-847a-65f1b3a3266c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:33:03 compute-0 nova_compute[186329]: 2025-12-05 06:33:03.327 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbcb12e6-ebf7-49e2-847a-65f1b3a3266c/disk --force-share --output=json" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:33:03 compute-0 nova_compute[186329]: 2025-12-05 06:33:03.330 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:33:03 compute-0 nova_compute[186329]: 2025-12-05 06:33:03.369 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk --force-share --output=json" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:33:03 compute-0 nova_compute[186329]: 2025-12-05 06:33:03.370 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:33:03 compute-0 nova_compute[186329]: 2025-12-05 06:33:03.410 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk --force-share --output=json" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:33:03 compute-0 nova_compute[186329]: 2025-12-05 06:33:03.590 186333 WARNING nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:33:03 compute-0 nova_compute[186329]: 2025-12-05 06:33:03.591 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:33:03 compute-0 nova_compute[186329]: 2025-12-05 06:33:03.605 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.014s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:33:03 compute-0 nova_compute[186329]: 2025-12-05 06:33:03.605 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5452MB free_disk=73.10820388793945GB free_vcpus=2 pci_devices=[{"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": 
"0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 05 06:33:03 compute-0 nova_compute[186329]: 2025-12-05 06:33:03.605 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:33:03 compute-0 nova_compute[186329]: 2025-12-05 06:33:03.606 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:33:04 compute-0 nova_compute[186329]: 2025-12-05 06:33:04.040 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:05 compute-0 nova_compute[186329]: 2025-12-05 06:33:05.157 186333 INFO nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance 83dcc9e9-9fcf-4e34-8b76-598c38ed9bc0 has allocations against this compute host but is not found in the database.
Dec 05 06:33:05 compute-0 nova_compute[186329]: 2025-12-05 06:33:05.157 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 05 06:33:05 compute-0 nova_compute[186329]: 2025-12-05 06:33:05.157 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 06:33:03 up  1:11,  0 user,  load average: 0.05, 0.09, 0.17\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 05 06:33:05 compute-0 nova_compute[186329]: 2025-12-05 06:33:05.172 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Refreshing inventories for resource provider f2df025e-56e9-4920-9fad-1a12202c4aeb _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Dec 05 06:33:05 compute-0 nova_compute[186329]: 2025-12-05 06:33:05.189 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Updating ProviderTree inventory for provider f2df025e-56e9-4920-9fad-1a12202c4aeb from _refresh_and_get_inventory using data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Dec 05 06:33:05 compute-0 nova_compute[186329]: 2025-12-05 06:33:05.190 186333 DEBUG nova.compute.provider_tree [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Updating inventory in ProviderTree for provider f2df025e-56e9-4920-9fad-1a12202c4aeb with inventory: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 05 06:33:05 compute-0 nova_compute[186329]: 2025-12-05 06:33:05.198 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Refreshing aggregate associations for resource provider f2df025e-56e9-4920-9fad-1a12202c4aeb, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Dec 05 06:33:05 compute-0 nova_compute[186329]: 2025-12-05 06:33:05.210 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Refreshing trait associations for resource provider f2df025e-56e9-4920-9fad-1a12202c4aeb, traits: COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_EXTEND,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SOUND_MODEL_SB16,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE42,COMPUTE_NET_VIRTIO_PACKED,HW_CPU_X86_SSE2,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_TPM_CRB,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_TPM_TIS,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SOUND_MODEL_ES1370,HW_ARCH_X86_64,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SOUND_MODEL_USB,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_TRUSTED_CERTS,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_ARCH_X86_64,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_INTEL _refresh_associations 
/usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Dec 05 06:33:05 compute-0 nova_compute[186329]: 2025-12-05 06:33:05.241 186333 DEBUG nova.compute.provider_tree [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:33:05 compute-0 nova_compute[186329]: 2025-12-05 06:33:05.745 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:33:06 compute-0 nova_compute[186329]: 2025-12-05 06:33:06.251 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 05 06:33:06 compute-0 nova_compute[186329]: 2025-12-05 06:33:06.251 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.645s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:33:06 compute-0 nova_compute[186329]: 2025-12-05 06:33:06.602 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:08 compute-0 nova_compute[186329]: 2025-12-05 06:33:08.247 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:33:08 compute-0 nova_compute[186329]: 2025-12-05 06:33:08.752 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:33:08 compute-0 nova_compute[186329]: 2025-12-05 06:33:08.752 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:33:08 compute-0 nova_compute[186329]: 2025-12-05 06:33:08.753 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:33:08 compute-0 nova_compute[186329]: 2025-12-05 06:33:08.753 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:33:08 compute-0 nova_compute[186329]: 2025-12-05 06:33:08.894 186333 DEBUG nova.virt.libvirt.driver [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: a9c6d4f3-641d-4e96-a015-0a81a2b58225] Creating tmpfile /var/lib/nova/instances/tmp5z9pjlsn to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Dec 05 06:33:08 compute-0 nova_compute[186329]: 2025-12-05 06:33:08.894 186333 WARNING neutronclient.v2_0.client [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:33:08 compute-0 nova_compute[186329]: 2025-12-05 06:33:08.897 186333 DEBUG nova.compute.manager [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp5z9pjlsn',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Dec 05 06:33:09 compute-0 nova_compute[186329]: 2025-12-05 06:33:09.041 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:09 compute-0 nova_compute[186329]: 2025-12-05 06:33:09.887 186333 DEBUG nova.virt.libvirt.driver [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: e148517d-c380-4199-be21-5345ab8bbacd] Creating tmpfile /var/lib/nova/instances/tmp4wjjiu2f to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Dec 05 06:33:09 compute-0 nova_compute[186329]: 2025-12-05 06:33:09.888 186333 WARNING neutronclient.v2_0.client [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:33:09 compute-0 nova_compute[186329]: 2025-12-05 06:33:09.890 186333 DEBUG nova.compute.manager [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4wjjiu2f',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Dec 05 06:33:10 compute-0 nova_compute[186329]: 2025-12-05 06:33:10.913 186333 WARNING neutronclient.v2_0.client [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:33:11 compute-0 nova_compute[186329]: 2025-12-05 06:33:11.604 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:11 compute-0 nova_compute[186329]: 2025-12-05 06:33:11.913 186333 WARNING neutronclient.v2_0.client [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:33:12 compute-0 podman[214847]: 2025-12-05 06:33:12.467319659 +0000 UTC m=+0.037789128 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 06:33:12 compute-0 podman[214846]: 2025-12-05 06:33:12.49138239 +0000 UTC m=+0.063187100 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.license=GPLv2)
Dec 05 06:33:14 compute-0 nova_compute[186329]: 2025-12-05 06:33:14.042 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:15 compute-0 nova_compute[186329]: 2025-12-05 06:33:15.476 186333 DEBUG nova.compute.manager [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp5z9pjlsn',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='a9c6d4f3-641d-4e96-a015-0a81a2b58225',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Dec 05 06:33:16 compute-0 nova_compute[186329]: 2025-12-05 06:33:16.485 186333 DEBUG oslo_concurrency.lockutils [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "refresh_cache-a9c6d4f3-641d-4e96-a015-0a81a2b58225" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:33:16 compute-0 nova_compute[186329]: 2025-12-05 06:33:16.485 186333 DEBUG oslo_concurrency.lockutils [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquired lock "refresh_cache-a9c6d4f3-641d-4e96-a015-0a81a2b58225" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:33:16 compute-0 nova_compute[186329]: 2025-12-05 06:33:16.485 186333 DEBUG nova.network.neutron [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: a9c6d4f3-641d-4e96-a015-0a81a2b58225] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 05 06:33:16 compute-0 nova_compute[186329]: 2025-12-05 06:33:16.606 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:16 compute-0 nova_compute[186329]: 2025-12-05 06:33:16.990 186333 WARNING neutronclient.v2_0.client [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:33:17 compute-0 nova_compute[186329]: 2025-12-05 06:33:17.305 186333 WARNING neutronclient.v2_0.client [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:33:17 compute-0 nova_compute[186329]: 2025-12-05 06:33:17.446 186333 DEBUG nova.network.neutron [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: a9c6d4f3-641d-4e96-a015-0a81a2b58225] Updating instance_info_cache with network_info: [{"id": "271c902e-30a9-4018-a883-e0977a7c70bd", "address": "fa:16:3e:20:ed:dc", "network": {"id": "b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-188557792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32d36f4d4d354e18b9270b2f2f540379", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap271c902e-30", "ovs_interfaceid": "271c902e-30a9-4018-a883-e0977a7c70bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:33:17 compute-0 nova_compute[186329]: 2025-12-05 06:33:17.950 186333 DEBUG oslo_concurrency.lockutils [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Releasing lock "refresh_cache-a9c6d4f3-641d-4e96-a015-0a81a2b58225" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:33:17 compute-0 nova_compute[186329]: 2025-12-05 06:33:17.958 186333 DEBUG nova.virt.libvirt.driver [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: a9c6d4f3-641d-4e96-a015-0a81a2b58225] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp5z9pjlsn',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='a9c6d4f3-641d-4e96-a015-0a81a2b58225',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Dec 05 06:33:17 compute-0 nova_compute[186329]: 2025-12-05 06:33:17.958 186333 DEBUG nova.virt.libvirt.driver [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: a9c6d4f3-641d-4e96-a015-0a81a2b58225] Creating instance directory: /var/lib/nova/instances/a9c6d4f3-641d-4e96-a015-0a81a2b58225 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Dec 05 06:33:17 compute-0 nova_compute[186329]: 2025-12-05 06:33:17.959 186333 DEBUG nova.virt.libvirt.driver [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: a9c6d4f3-641d-4e96-a015-0a81a2b58225] Creating disk.info with the contents: {'/var/lib/nova/instances/a9c6d4f3-641d-4e96-a015-0a81a2b58225/disk': 'qcow2', '/var/lib/nova/instances/a9c6d4f3-641d-4e96-a015-0a81a2b58225/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Dec 05 06:33:17 compute-0 nova_compute[186329]: 2025-12-05 06:33:17.959 186333 DEBUG nova.virt.libvirt.driver [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: a9c6d4f3-641d-4e96-a015-0a81a2b58225] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Dec 05 06:33:17 compute-0 nova_compute[186329]: 2025-12-05 06:33:17.959 186333 DEBUG nova.objects.instance [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lazy-loading 'trusted_certs' on Instance uuid a9c6d4f3-641d-4e96-a015-0a81a2b58225 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:33:18 compute-0 nova_compute[186329]: 2025-12-05 06:33:18.463 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:33:18 compute-0 nova_compute[186329]: 2025-12-05 06:33:18.466 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:33:18 compute-0 nova_compute[186329]: 2025-12-05 06:33:18.467 186333 DEBUG oslo_concurrency.processutils [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:33:18 compute-0 nova_compute[186329]: 2025-12-05 06:33:18.507 186333 DEBUG oslo_concurrency.processutils [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:33:18 compute-0 nova_compute[186329]: 2025-12-05 06:33:18.508 186333 DEBUG oslo_concurrency.lockutils [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:33:18 compute-0 nova_compute[186329]: 2025-12-05 06:33:18.508 186333 DEBUG oslo_concurrency.lockutils [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:33:18 compute-0 nova_compute[186329]: 2025-12-05 06:33:18.509 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:33:18 compute-0 nova_compute[186329]: 2025-12-05 06:33:18.511 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:33:18 compute-0 nova_compute[186329]: 2025-12-05 06:33:18.511 186333 DEBUG oslo_concurrency.processutils [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:33:18 compute-0 nova_compute[186329]: 2025-12-05 06:33:18.560 186333 DEBUG oslo_concurrency.processutils [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:33:18 compute-0 nova_compute[186329]: 2025-12-05 06:33:18.561 186333 DEBUG oslo_concurrency.processutils [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b,backing_fmt=raw /var/lib/nova/instances/a9c6d4f3-641d-4e96-a015-0a81a2b58225/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:33:18 compute-0 nova_compute[186329]: 2025-12-05 06:33:18.578 186333 DEBUG oslo_concurrency.processutils [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b,backing_fmt=raw /var/lib/nova/instances/a9c6d4f3-641d-4e96-a015-0a81a2b58225/disk 1073741824" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:33:18 compute-0 nova_compute[186329]: 2025-12-05 06:33:18.579 186333 DEBUG oslo_concurrency.lockutils [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.071s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:33:18 compute-0 nova_compute[186329]: 2025-12-05 06:33:18.579 186333 DEBUG oslo_concurrency.processutils [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:33:18 compute-0 nova_compute[186329]: 2025-12-05 06:33:18.617 186333 DEBUG oslo_concurrency.processutils [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.038s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:33:18 compute-0 nova_compute[186329]: 2025-12-05 06:33:18.618 186333 DEBUG nova.virt.disk.api [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Checking if we can resize image /var/lib/nova/instances/a9c6d4f3-641d-4e96-a015-0a81a2b58225/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 05 06:33:18 compute-0 nova_compute[186329]: 2025-12-05 06:33:18.618 186333 DEBUG oslo_concurrency.processutils [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a9c6d4f3-641d-4e96-a015-0a81a2b58225/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:33:18 compute-0 nova_compute[186329]: 2025-12-05 06:33:18.656 186333 DEBUG oslo_concurrency.processutils [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a9c6d4f3-641d-4e96-a015-0a81a2b58225/disk --force-share --output=json" returned: 0 in 0.038s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:33:18 compute-0 nova_compute[186329]: 2025-12-05 06:33:18.657 186333 DEBUG nova.virt.disk.api [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Cannot resize image /var/lib/nova/instances/a9c6d4f3-641d-4e96-a015-0a81a2b58225/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 05 06:33:18 compute-0 nova_compute[186329]: 2025-12-05 06:33:18.657 186333 DEBUG nova.objects.instance [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lazy-loading 'migration_context' on Instance uuid a9c6d4f3-641d-4e96-a015-0a81a2b58225 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:33:19 compute-0 nova_compute[186329]: 2025-12-05 06:33:19.043 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:19 compute-0 nova_compute[186329]: 2025-12-05 06:33:19.161 186333 DEBUG nova.objects.base [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Object Instance<a9c6d4f3-641d-4e96-a015-0a81a2b58225> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Dec 05 06:33:19 compute-0 nova_compute[186329]: 2025-12-05 06:33:19.162 186333 DEBUG oslo_concurrency.processutils [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/a9c6d4f3-641d-4e96-a015-0a81a2b58225/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:33:19 compute-0 nova_compute[186329]: 2025-12-05 06:33:19.178 186333 DEBUG oslo_concurrency.processutils [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/a9c6d4f3-641d-4e96-a015-0a81a2b58225/disk.config 497664" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:33:19 compute-0 nova_compute[186329]: 2025-12-05 06:33:19.179 186333 DEBUG nova.virt.libvirt.driver [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: a9c6d4f3-641d-4e96-a015-0a81a2b58225] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Dec 05 06:33:19 compute-0 nova_compute[186329]: 2025-12-05 06:33:19.180 186333 DEBUG nova.virt.libvirt.vif [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-05T06:32:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-915657501',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-9156575',id=24,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T06:32:35Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3c9c02fc0c8641c48e6ccfd619fde68d',ramdisk_id='',reservation_id='r-d0r66gci',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,member,admin,reader',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-217954151',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-217954151-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T06:32:35Z,user_data=None,user_id='19182660d2484754b9a921f3caf09b6b',uuid=a9c6d4f3-641d-4e96-a015-0a81a2b58225,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "271c902e-30a9-4018-a883-e0977a7c70bd", "address": "fa:16:3e:20:ed:dc", "network": {"id": "b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-188557792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32d36f4d4d354e18b9270b2f2f540379", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap271c902e-30", "ovs_interfaceid": "271c902e-30a9-4018-a883-e0977a7c70bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 05 06:33:19 compute-0 nova_compute[186329]: 2025-12-05 06:33:19.180 186333 DEBUG nova.network.os_vif_util [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Converting VIF {"id": "271c902e-30a9-4018-a883-e0977a7c70bd", "address": "fa:16:3e:20:ed:dc", "network": {"id": "b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-188557792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32d36f4d4d354e18b9270b2f2f540379", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap271c902e-30", "ovs_interfaceid": "271c902e-30a9-4018-a883-e0977a7c70bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:33:19 compute-0 nova_compute[186329]: 2025-12-05 06:33:19.181 186333 DEBUG nova.network.os_vif_util [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:20:ed:dc,bridge_name='br-int',has_traffic_filtering=True,id=271c902e-30a9-4018-a883-e0977a7c70bd,network=Network(b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap271c902e-30') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:33:19 compute-0 nova_compute[186329]: 2025-12-05 06:33:19.181 186333 DEBUG os_vif [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:20:ed:dc,bridge_name='br-int',has_traffic_filtering=True,id=271c902e-30a9-4018-a883-e0977a7c70bd,network=Network(b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap271c902e-30') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 05 06:33:19 compute-0 nova_compute[186329]: 2025-12-05 06:33:19.182 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:19 compute-0 nova_compute[186329]: 2025-12-05 06:33:19.182 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:33:19 compute-0 nova_compute[186329]: 2025-12-05 06:33:19.183 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:33:19 compute-0 nova_compute[186329]: 2025-12-05 06:33:19.183 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:19 compute-0 nova_compute[186329]: 2025-12-05 06:33:19.184 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '441a9dba-45a7-5f9f-b887-f8d20f29e4bc', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:33:19 compute-0 nova_compute[186329]: 2025-12-05 06:33:19.184 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:19 compute-0 nova_compute[186329]: 2025-12-05 06:33:19.185 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:19 compute-0 nova_compute[186329]: 2025-12-05 06:33:19.187 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:19 compute-0 nova_compute[186329]: 2025-12-05 06:33:19.187 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap271c902e-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:33:19 compute-0 nova_compute[186329]: 2025-12-05 06:33:19.187 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap271c902e-30, col_values=(('qos', UUID('6c9360c5-cf0c-4769-ad7d-34ac0673a321')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:33:19 compute-0 nova_compute[186329]: 2025-12-05 06:33:19.188 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap271c902e-30, col_values=(('external_ids', {'iface-id': '271c902e-30a9-4018-a883-e0977a7c70bd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:20:ed:dc', 'vm-uuid': 'a9c6d4f3-641d-4e96-a015-0a81a2b58225'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:33:19 compute-0 nova_compute[186329]: 2025-12-05 06:33:19.188 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:19 compute-0 NetworkManager[55434]: <info>  [1764916399.1893] manager: (tap271c902e-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/87)
Dec 05 06:33:19 compute-0 nova_compute[186329]: 2025-12-05 06:33:19.191 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:33:19 compute-0 nova_compute[186329]: 2025-12-05 06:33:19.192 186333 INFO os_vif [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:20:ed:dc,bridge_name='br-int',has_traffic_filtering=True,id=271c902e-30a9-4018-a883-e0977a7c70bd,network=Network(b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap271c902e-30')
Dec 05 06:33:19 compute-0 nova_compute[186329]: 2025-12-05 06:33:19.193 186333 DEBUG nova.virt.libvirt.driver [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Dec 05 06:33:19 compute-0 nova_compute[186329]: 2025-12-05 06:33:19.193 186333 DEBUG nova.compute.manager [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp5z9pjlsn',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='a9c6d4f3-641d-4e96-a015-0a81a2b58225',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Dec 05 06:33:19 compute-0 nova_compute[186329]: 2025-12-05 06:33:19.193 186333 WARNING neutronclient.v2_0.client [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:33:19 compute-0 nova_compute[186329]: 2025-12-05 06:33:19.542 186333 WARNING neutronclient.v2_0.client [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:33:19 compute-0 nova_compute[186329]: 2025-12-05 06:33:19.985 186333 DEBUG nova.network.neutron [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: a9c6d4f3-641d-4e96-a015-0a81a2b58225] Port 271c902e-30a9-4018-a883-e0977a7c70bd updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Dec 05 06:33:19 compute-0 nova_compute[186329]: 2025-12-05 06:33:19.993 186333 DEBUG nova.compute.manager [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp5z9pjlsn',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='a9c6d4f3-641d-4e96-a015-0a81a2b58225',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Dec 05 06:33:20 compute-0 ovn_controller[95223]: 2025-12-05T06:33:20Z|00205|memory_trim|INFO|Detected inactivity (last active 30000 ms ago): trimming memory
Dec 05 06:33:22 compute-0 podman[214913]: 2025-12-05 06:33:22.457248823 +0000 UTC m=+0.038143084 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.build-date=20251202)
Dec 05 06:33:22 compute-0 podman[214912]: 2025-12-05 06:33:22.459557043 +0000 UTC m=+0.042797554 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, release=1755695350, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, architecture=x86_64, name=ubi9-minimal)
Dec 05 06:33:22 compute-0 podman[214911]: 2025-12-05 06:33:22.481356979 +0000 UTC m=+0.066365836 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 05 06:33:23 compute-0 kernel: tap271c902e-30: entered promiscuous mode
Dec 05 06:33:23 compute-0 NetworkManager[55434]: <info>  [1764916403.4960] manager: (tap271c902e-30): new Tun device (/org/freedesktop/NetworkManager/Devices/88)
Dec 05 06:33:23 compute-0 nova_compute[186329]: 2025-12-05 06:33:23.497 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:23 compute-0 ovn_controller[95223]: 2025-12-05T06:33:23Z|00206|binding|INFO|Claiming lport 271c902e-30a9-4018-a883-e0977a7c70bd for this additional chassis.
Dec 05 06:33:23 compute-0 ovn_controller[95223]: 2025-12-05T06:33:23Z|00207|binding|INFO|271c902e-30a9-4018-a883-e0977a7c70bd: Claiming fa:16:3e:20:ed:dc 10.100.0.4
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:23.511 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:20:ed:dc 10.100.0.4'], port_security=['fa:16:3e:20:ed:dc 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'a9c6d4f3-641d-4e96-a015-0a81a2b58225', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3c9c02fc0c8641c48e6ccfd619fde68d', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'bcbaedeb-3d7e-4385-ba5a-8212e2f97a4c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fdc9478a-32e1-4f82-931f-bf423e0028ed, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=271c902e-30a9-4018-a883-e0977a7c70bd) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:33:23 compute-0 ovn_controller[95223]: 2025-12-05T06:33:23Z|00208|binding|INFO|Setting lport 271c902e-30a9-4018-a883-e0977a7c70bd ovn-installed in OVS
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:23.511 104041 INFO neutron.agent.ovn.metadata.agent [-] Port 271c902e-30a9-4018-a883-e0977a7c70bd in datapath b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4 unbound from our chassis
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:23.512 104041 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4
Dec 05 06:33:23 compute-0 nova_compute[186329]: 2025-12-05 06:33:23.512 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:23.521 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[824ef6a9-05fb-402b-9e75-60e3184f6623]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:33:23 compute-0 nova_compute[186329]: 2025-12-05 06:33:23.520 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:23.521 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb1b8634d-61 in ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:23.522 206383 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb1b8634d-60 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:23.522 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[27ea3240-4dd3-4f76-b71a-b1cf476596c5]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:23.523 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[a47dcaaf-5bdd-40ea-9795-3cca1e152bf8]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:33:23 compute-0 systemd-udevd[214979]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:23.531 104153 DEBUG oslo.privsep.daemon [-] privsep: reply[08f81be4-26fb-4491-81a1-8b66262dd1af]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:33:23 compute-0 systemd-machined[152967]: New machine qemu-19-instance-00000018.
Dec 05 06:33:23 compute-0 NetworkManager[55434]: <info>  [1764916403.5369] device (tap271c902e-30): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 06:33:23 compute-0 NetworkManager[55434]: <info>  [1764916403.5378] device (tap271c902e-30): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 06:33:23 compute-0 systemd[1]: Started Virtual Machine qemu-19-instance-00000018.
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:23.545 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[f2bbcc06-04d4-4422-bca8-d5121e69e0ac]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:23.563 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[0d9860c7-c549-450e-93f1-5d199fab2237]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:33:23 compute-0 NetworkManager[55434]: <info>  [1764916403.5669] manager: (tapb1b8634d-60): new Veth device (/org/freedesktop/NetworkManager/Devices/89)
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:23.566 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[df616415-ea94-4cf0-9997-1785f88a0048]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:23.590 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[574c1d36-042f-4195-b728-9ed160e300b4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:23.592 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[fbbf940c-6e46-4a68-94f2-af485f371b07]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:33:23 compute-0 NetworkManager[55434]: <info>  [1764916403.6059] device (tapb1b8634d-60): carrier: link connected
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:23.608 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[0cb4ebad-71d5-4f5e-b8e5-f6a5fa17efd6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:23.619 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[cba6c249-aafb-42fe-b229-ffa864d08e92]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb1b8634d-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:73:d2:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 428214, 'reachable_time': 42190, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215004, 'error': None, 'target': 'ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:23.627 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[2af39d7b-1827-4f22-9ed5-96f9431d33b7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe73:d262'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 428214, 'tstamp': 428214}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215005, 'error': None, 'target': 'ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:23.637 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[2a799e27-0b93-4544-9876-1548d992c283]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb1b8634d-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:73:d2:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 428214, 'reachable_time': 42190, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215006, 'error': None, 'target': 'ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:23.656 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[7c8b0a79-91af-4b0a-bdfe-170df5aeb41d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:23.695 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[91c2c871-db9a-4d63-bc2a-81435d0b0907]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:23.696 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb1b8634d-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:23.696 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:23.697 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb1b8634d-60, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:33:23 compute-0 nova_compute[186329]: 2025-12-05 06:33:23.698 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:23 compute-0 kernel: tapb1b8634d-60: entered promiscuous mode
Dec 05 06:33:23 compute-0 NetworkManager[55434]: <info>  [1764916403.6985] manager: (tapb1b8634d-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/90)
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:23.699 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb1b8634d-60, col_values=(('external_ids', {'iface-id': 'd9989e5b-035f-46c9-8e0e-5fd96cf594af'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:33:23 compute-0 ovn_controller[95223]: 2025-12-05T06:33:23Z|00209|binding|INFO|Releasing lport d9989e5b-035f-46c9-8e0e-5fd96cf594af from this chassis (sb_readonly=0)
Dec 05 06:33:23 compute-0 nova_compute[186329]: 2025-12-05 06:33:23.712 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:23.713 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[40b85a54-8439-4644-8f83-a4b563bb9505]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:23.714 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:23.714 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:23.714 104041 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:23.714 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:23.715 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[5c068a7e-4aae-47cd-9d49-4ae707222800]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:23.715 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:23.716 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[5f6b4458-5bb3-4cec-9c37-a5d73e8a41fd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:23.716 104041 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]: global
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]:     log         /dev/log local0 debug
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]:     log-tag     haproxy-metadata-proxy-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]:     user        root
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]:     group       root
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]:     maxconn     1024
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]:     pidfile     /var/lib/neutron/external/pids/b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4.pid.haproxy
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]:     daemon
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]: defaults
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]:     log global
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]:     mode http
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]:     option httplog
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]:     option dontlognull
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]:     option http-server-close
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]:     option forwardfor
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]:     retries                 3
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]:     timeout http-request    30s
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]:     timeout connect         30s
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]:     timeout client          32s
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]:     timeout server          32s
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]:     timeout http-keep-alive 30s
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]: listen listener
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]:     bind 169.254.169.254:80
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]:     
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]:     http-request add-header X-OVN-Network-ID b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 05 06:33:23 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:23.718 104041 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4', 'env', 'PROCESS_TAG=haproxy-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 05 06:33:24 compute-0 podman[215041]: 2025-12-05 06:33:24.019001541 +0000 UTC m=+0.031055109 container create 268eb87f48a950bd84372060ae1dba4dabe839e2534ecd1b0c5def708d8b1633 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 05 06:33:24 compute-0 nova_compute[186329]: 2025-12-05 06:33:24.045 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:24 compute-0 systemd[1]: Started libpod-conmon-268eb87f48a950bd84372060ae1dba4dabe839e2534ecd1b0c5def708d8b1633.scope.
Dec 05 06:33:24 compute-0 systemd[1]: Started libcrun container.
Dec 05 06:33:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1bea6c7595a888f1f3451a9d5b5f931e5eefaf0296afdd3c6ab9f8700a83256a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 06:33:24 compute-0 podman[215041]: 2025-12-05 06:33:24.070873416 +0000 UTC m=+0.082926984 container init 268eb87f48a950bd84372060ae1dba4dabe839e2534ecd1b0c5def708d8b1633 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 05 06:33:24 compute-0 podman[215041]: 2025-12-05 06:33:24.075408623 +0000 UTC m=+0.087462191 container start 268eb87f48a950bd84372060ae1dba4dabe839e2534ecd1b0c5def708d8b1633 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4)
Dec 05 06:33:24 compute-0 podman[215041]: 2025-12-05 06:33:24.00399697 +0000 UTC m=+0.016050558 image pull 773fbabd60bc1c8e2ad203301a187a593937fa42bcc751aba0971bd96baac0cb quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current
Dec 05 06:33:24 compute-0 neutron-haproxy-ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4[215052]: [NOTICE]   (215056) : New worker (215058) forked
Dec 05 06:33:24 compute-0 neutron-haproxy-ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4[215052]: [NOTICE]   (215056) : Loading success.
Dec 05 06:33:24 compute-0 nova_compute[186329]: 2025-12-05 06:33:24.188 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:25 compute-0 ovn_controller[95223]: 2025-12-05T06:33:25Z|00210|binding|INFO|Claiming lport 271c902e-30a9-4018-a883-e0977a7c70bd for this chassis.
Dec 05 06:33:25 compute-0 ovn_controller[95223]: 2025-12-05T06:33:25Z|00211|binding|INFO|271c902e-30a9-4018-a883-e0977a7c70bd: Claiming fa:16:3e:20:ed:dc 10.100.0.4
Dec 05 06:33:25 compute-0 ovn_controller[95223]: 2025-12-05T06:33:25Z|00212|binding|INFO|Setting lport 271c902e-30a9-4018-a883-e0977a7c70bd up in Southbound
Dec 05 06:33:26 compute-0 nova_compute[186329]: 2025-12-05 06:33:26.676 186333 INFO nova.compute.manager [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: a9c6d4f3-641d-4e96-a015-0a81a2b58225] Post operation of migration started
Dec 05 06:33:26 compute-0 nova_compute[186329]: 2025-12-05 06:33:26.677 186333 WARNING neutronclient.v2_0.client [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:33:26 compute-0 nova_compute[186329]: 2025-12-05 06:33:26.773 186333 WARNING neutronclient.v2_0.client [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:33:26 compute-0 nova_compute[186329]: 2025-12-05 06:33:26.773 186333 WARNING neutronclient.v2_0.client [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:33:26 compute-0 nova_compute[186329]: 2025-12-05 06:33:26.829 186333 DEBUG oslo_concurrency.lockutils [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "refresh_cache-a9c6d4f3-641d-4e96-a015-0a81a2b58225" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:33:26 compute-0 nova_compute[186329]: 2025-12-05 06:33:26.829 186333 DEBUG oslo_concurrency.lockutils [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquired lock "refresh_cache-a9c6d4f3-641d-4e96-a015-0a81a2b58225" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:33:26 compute-0 nova_compute[186329]: 2025-12-05 06:33:26.830 186333 DEBUG nova.network.neutron [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: a9c6d4f3-641d-4e96-a015-0a81a2b58225] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 05 06:33:27 compute-0 nova_compute[186329]: 2025-12-05 06:33:27.334 186333 WARNING neutronclient.v2_0.client [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:33:27 compute-0 nova_compute[186329]: 2025-12-05 06:33:27.620 186333 WARNING neutronclient.v2_0.client [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:33:27 compute-0 nova_compute[186329]: 2025-12-05 06:33:27.737 186333 DEBUG nova.network.neutron [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: a9c6d4f3-641d-4e96-a015-0a81a2b58225] Updating instance_info_cache with network_info: [{"id": "271c902e-30a9-4018-a883-e0977a7c70bd", "address": "fa:16:3e:20:ed:dc", "network": {"id": "b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-188557792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32d36f4d4d354e18b9270b2f2f540379", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap271c902e-30", "ovs_interfaceid": "271c902e-30a9-4018-a883-e0977a7c70bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:33:28 compute-0 nova_compute[186329]: 2025-12-05 06:33:28.242 186333 DEBUG oslo_concurrency.lockutils [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Releasing lock "refresh_cache-a9c6d4f3-641d-4e96-a015-0a81a2b58225" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:33:28 compute-0 nova_compute[186329]: 2025-12-05 06:33:28.752 186333 DEBUG oslo_concurrency.lockutils [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:33:28 compute-0 nova_compute[186329]: 2025-12-05 06:33:28.752 186333 DEBUG oslo_concurrency.lockutils [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:33:28 compute-0 nova_compute[186329]: 2025-12-05 06:33:28.753 186333 DEBUG oslo_concurrency.lockutils [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:33:28 compute-0 nova_compute[186329]: 2025-12-05 06:33:28.756 186333 INFO nova.virt.libvirt.driver [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: a9c6d4f3-641d-4e96-a015-0a81a2b58225] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Dec 05 06:33:28 compute-0 virtqemud[186605]: Domain id=19 name='instance-00000018' uuid=a9c6d4f3-641d-4e96-a015-0a81a2b58225 is tainted: custom-monitor
Dec 05 06:33:29 compute-0 nova_compute[186329]: 2025-12-05 06:33:29.046 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:29 compute-0 nova_compute[186329]: 2025-12-05 06:33:29.189 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:29.520 104041 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:33:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:29.521 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:33:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:29.522 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:33:29 compute-0 podman[196599]: time="2025-12-05T06:33:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:33:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:33:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18590 "" "Go-http-client/1.1"
Dec 05 06:33:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:33:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3054 "" "Go-http-client/1.1"
Dec 05 06:33:29 compute-0 nova_compute[186329]: 2025-12-05 06:33:29.760 186333 INFO nova.virt.libvirt.driver [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: a9c6d4f3-641d-4e96-a015-0a81a2b58225] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Dec 05 06:33:30 compute-0 nova_compute[186329]: 2025-12-05 06:33:30.765 186333 INFO nova.virt.libvirt.driver [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: a9c6d4f3-641d-4e96-a015-0a81a2b58225] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Dec 05 06:33:30 compute-0 nova_compute[186329]: 2025-12-05 06:33:30.768 186333 DEBUG nova.compute.manager [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: a9c6d4f3-641d-4e96-a015-0a81a2b58225] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 05 06:33:31 compute-0 nova_compute[186329]: 2025-12-05 06:33:31.275 186333 DEBUG nova.objects.instance [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: a9c6d4f3-641d-4e96-a015-0a81a2b58225] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Dec 05 06:33:31 compute-0 openstack_network_exporter[198686]: ERROR   06:33:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:33:31 compute-0 openstack_network_exporter[198686]: ERROR   06:33:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:33:31 compute-0 openstack_network_exporter[198686]: ERROR   06:33:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:33:31 compute-0 openstack_network_exporter[198686]: ERROR   06:33:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:33:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:33:31 compute-0 openstack_network_exporter[198686]: ERROR   06:33:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:33:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:33:32 compute-0 nova_compute[186329]: 2025-12-05 06:33:32.286 186333 WARNING neutronclient.v2_0.client [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:33:32 compute-0 nova_compute[186329]: 2025-12-05 06:33:32.344 186333 WARNING neutronclient.v2_0.client [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:33:32 compute-0 nova_compute[186329]: 2025-12-05 06:33:32.345 186333 WARNING neutronclient.v2_0.client [None req-b5d2fe17-ff97-4c62-abd6-68e0b125d287 e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:33:34 compute-0 nova_compute[186329]: 2025-12-05 06:33:34.047 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:34 compute-0 nova_compute[186329]: 2025-12-05 06:33:34.191 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:39 compute-0 nova_compute[186329]: 2025-12-05 06:33:39.048 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:39 compute-0 nova_compute[186329]: 2025-12-05 06:33:39.193 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:41 compute-0 nova_compute[186329]: 2025-12-05 06:33:41.349 186333 DEBUG nova.compute.manager [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4wjjiu2f',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e148517d-c380-4199-be21-5345ab8bbacd',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Dec 05 06:33:42 compute-0 nova_compute[186329]: 2025-12-05 06:33:42.358 186333 DEBUG oslo_concurrency.lockutils [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "refresh_cache-e148517d-c380-4199-be21-5345ab8bbacd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:33:42 compute-0 nova_compute[186329]: 2025-12-05 06:33:42.358 186333 DEBUG oslo_concurrency.lockutils [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquired lock "refresh_cache-e148517d-c380-4199-be21-5345ab8bbacd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:33:42 compute-0 nova_compute[186329]: 2025-12-05 06:33:42.359 186333 DEBUG nova.network.neutron [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: e148517d-c380-4199-be21-5345ab8bbacd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 05 06:33:42 compute-0 nova_compute[186329]: 2025-12-05 06:33:42.864 186333 WARNING neutronclient.v2_0.client [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:33:43 compute-0 podman[215072]: 2025-12-05 06:33:43.455444847 +0000 UTC m=+0.038193188 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 06:33:43 compute-0 podman[215071]: 2025-12-05 06:33:43.487429202 +0000 UTC m=+0.071936461 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 05 06:33:43 compute-0 nova_compute[186329]: 2025-12-05 06:33:43.905 186333 WARNING neutronclient.v2_0.client [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:33:44 compute-0 nova_compute[186329]: 2025-12-05 06:33:44.050 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:44 compute-0 nova_compute[186329]: 2025-12-05 06:33:44.193 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:44 compute-0 nova_compute[186329]: 2025-12-05 06:33:44.750 186333 DEBUG nova.network.neutron [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: e148517d-c380-4199-be21-5345ab8bbacd] Updating instance_info_cache with network_info: [{"id": "2a0d8e58-c481-432e-93eb-24622dcdafe6", "address": "fa:16:3e:13:6e:e4", "network": {"id": "b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-188557792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32d36f4d4d354e18b9270b2f2f540379", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a0d8e58-c4", "ovs_interfaceid": "2a0d8e58-c481-432e-93eb-24622dcdafe6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:33:45 compute-0 nova_compute[186329]: 2025-12-05 06:33:45.254 186333 DEBUG oslo_concurrency.lockutils [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Releasing lock "refresh_cache-e148517d-c380-4199-be21-5345ab8bbacd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:33:45 compute-0 nova_compute[186329]: 2025-12-05 06:33:45.262 186333 DEBUG nova.virt.libvirt.driver [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: e148517d-c380-4199-be21-5345ab8bbacd] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4wjjiu2f',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e148517d-c380-4199-be21-5345ab8bbacd',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Dec 05 06:33:45 compute-0 nova_compute[186329]: 2025-12-05 06:33:45.262 186333 DEBUG nova.virt.libvirt.driver [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: e148517d-c380-4199-be21-5345ab8bbacd] Creating instance directory: /var/lib/nova/instances/e148517d-c380-4199-be21-5345ab8bbacd pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Dec 05 06:33:45 compute-0 nova_compute[186329]: 2025-12-05 06:33:45.262 186333 DEBUG nova.virt.libvirt.driver [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: e148517d-c380-4199-be21-5345ab8bbacd] Creating disk.info with the contents: {'/var/lib/nova/instances/e148517d-c380-4199-be21-5345ab8bbacd/disk': 'qcow2', '/var/lib/nova/instances/e148517d-c380-4199-be21-5345ab8bbacd/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Dec 05 06:33:45 compute-0 nova_compute[186329]: 2025-12-05 06:33:45.263 186333 DEBUG nova.virt.libvirt.driver [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: e148517d-c380-4199-be21-5345ab8bbacd] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Dec 05 06:33:45 compute-0 nova_compute[186329]: 2025-12-05 06:33:45.263 186333 DEBUG nova.objects.instance [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lazy-loading 'trusted_certs' on Instance uuid e148517d-c380-4199-be21-5345ab8bbacd obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:33:45 compute-0 nova_compute[186329]: 2025-12-05 06:33:45.767 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:33:45 compute-0 nova_compute[186329]: 2025-12-05 06:33:45.770 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:33:45 compute-0 nova_compute[186329]: 2025-12-05 06:33:45.771 186333 DEBUG oslo_concurrency.processutils [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:33:45 compute-0 nova_compute[186329]: 2025-12-05 06:33:45.814 186333 DEBUG oslo_concurrency.processutils [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:33:45 compute-0 nova_compute[186329]: 2025-12-05 06:33:45.815 186333 DEBUG oslo_concurrency.lockutils [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:33:45 compute-0 nova_compute[186329]: 2025-12-05 06:33:45.815 186333 DEBUG oslo_concurrency.lockutils [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:33:45 compute-0 nova_compute[186329]: 2025-12-05 06:33:45.815 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:33:45 compute-0 nova_compute[186329]: 2025-12-05 06:33:45.818 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:33:45 compute-0 nova_compute[186329]: 2025-12-05 06:33:45.818 186333 DEBUG oslo_concurrency.processutils [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:33:45 compute-0 nova_compute[186329]: 2025-12-05 06:33:45.859 186333 DEBUG oslo_concurrency.processutils [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:33:45 compute-0 nova_compute[186329]: 2025-12-05 06:33:45.860 186333 DEBUG oslo_concurrency.processutils [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b,backing_fmt=raw /var/lib/nova/instances/e148517d-c380-4199-be21-5345ab8bbacd/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:33:45 compute-0 nova_compute[186329]: 2025-12-05 06:33:45.880 186333 DEBUG oslo_concurrency.processutils [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b,backing_fmt=raw /var/lib/nova/instances/e148517d-c380-4199-be21-5345ab8bbacd/disk 1073741824" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:33:45 compute-0 nova_compute[186329]: 2025-12-05 06:33:45.880 186333 DEBUG oslo_concurrency.lockutils [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.065s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:33:45 compute-0 nova_compute[186329]: 2025-12-05 06:33:45.881 186333 DEBUG oslo_concurrency.processutils [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:33:45 compute-0 nova_compute[186329]: 2025-12-05 06:33:45.921 186333 DEBUG oslo_concurrency.processutils [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:33:45 compute-0 nova_compute[186329]: 2025-12-05 06:33:45.922 186333 DEBUG nova.virt.disk.api [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Checking if we can resize image /var/lib/nova/instances/e148517d-c380-4199-be21-5345ab8bbacd/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 05 06:33:45 compute-0 nova_compute[186329]: 2025-12-05 06:33:45.922 186333 DEBUG oslo_concurrency.processutils [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e148517d-c380-4199-be21-5345ab8bbacd/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:33:45 compute-0 nova_compute[186329]: 2025-12-05 06:33:45.964 186333 DEBUG oslo_concurrency.processutils [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e148517d-c380-4199-be21-5345ab8bbacd/disk --force-share --output=json" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:33:45 compute-0 nova_compute[186329]: 2025-12-05 06:33:45.964 186333 DEBUG nova.virt.disk.api [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Cannot resize image /var/lib/nova/instances/e148517d-c380-4199-be21-5345ab8bbacd/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 05 06:33:45 compute-0 nova_compute[186329]: 2025-12-05 06:33:45.964 186333 DEBUG nova.objects.instance [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lazy-loading 'migration_context' on Instance uuid e148517d-c380-4199-be21-5345ab8bbacd obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:33:46 compute-0 nova_compute[186329]: 2025-12-05 06:33:46.469 186333 DEBUG nova.objects.base [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Object Instance<e148517d-c380-4199-be21-5345ab8bbacd> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Dec 05 06:33:46 compute-0 nova_compute[186329]: 2025-12-05 06:33:46.469 186333 DEBUG oslo_concurrency.processutils [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/e148517d-c380-4199-be21-5345ab8bbacd/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:33:46 compute-0 nova_compute[186329]: 2025-12-05 06:33:46.487 186333 DEBUG oslo_concurrency.processutils [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/e148517d-c380-4199-be21-5345ab8bbacd/disk.config 497664" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:33:46 compute-0 nova_compute[186329]: 2025-12-05 06:33:46.487 186333 DEBUG nova.virt.libvirt.driver [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: e148517d-c380-4199-be21-5345ab8bbacd] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Dec 05 06:33:46 compute-0 nova_compute[186329]: 2025-12-05 06:33:46.488 186333 DEBUG nova.virt.libvirt.vif [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-05T06:32:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-9102637',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-9102637',id=25,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T06:32:52Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3c9c02fc0c8641c48e6ccfd619fde68d',ramdisk_id='',reservation_id='r-seu8uf0u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,member,admin,reader',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_
hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-217954151',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-217954151-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T06:32:52Z,user_data=None,user_id='19182660d2484754b9a921f3caf09b6b',uuid=e148517d-c380-4199-be21-5345ab8bbacd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2a0d8e58-c481-432e-93eb-24622dcdafe6", "address": "fa:16:3e:13:6e:e4", "network": {"id": "b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-188557792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32d36f4d4d354e18b9270b2f2f540379", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap2a0d8e58-c4", "ovs_interfaceid": "2a0d8e58-c481-432e-93eb-24622dcdafe6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 05 06:33:46 compute-0 nova_compute[186329]: 2025-12-05 06:33:46.489 186333 DEBUG nova.network.os_vif_util [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Converting VIF {"id": "2a0d8e58-c481-432e-93eb-24622dcdafe6", "address": "fa:16:3e:13:6e:e4", "network": {"id": "b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-188557792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32d36f4d4d354e18b9270b2f2f540379", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap2a0d8e58-c4", "ovs_interfaceid": "2a0d8e58-c481-432e-93eb-24622dcdafe6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:33:46 compute-0 nova_compute[186329]: 2025-12-05 06:33:46.489 186333 DEBUG nova.network.os_vif_util [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:13:6e:e4,bridge_name='br-int',has_traffic_filtering=True,id=2a0d8e58-c481-432e-93eb-24622dcdafe6,network=Network(b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a0d8e58-c4') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:33:46 compute-0 nova_compute[186329]: 2025-12-05 06:33:46.490 186333 DEBUG os_vif [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:13:6e:e4,bridge_name='br-int',has_traffic_filtering=True,id=2a0d8e58-c481-432e-93eb-24622dcdafe6,network=Network(b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a0d8e58-c4') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 05 06:33:46 compute-0 nova_compute[186329]: 2025-12-05 06:33:46.490 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:46 compute-0 nova_compute[186329]: 2025-12-05 06:33:46.491 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:33:46 compute-0 nova_compute[186329]: 2025-12-05 06:33:46.491 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:33:46 compute-0 nova_compute[186329]: 2025-12-05 06:33:46.492 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:46 compute-0 nova_compute[186329]: 2025-12-05 06:33:46.492 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'b68a09fe-44ab-5bdc-b766-f393d89e6cc6', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:33:46 compute-0 nova_compute[186329]: 2025-12-05 06:33:46.493 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:46 compute-0 nova_compute[186329]: 2025-12-05 06:33:46.494 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:46 compute-0 nova_compute[186329]: 2025-12-05 06:33:46.495 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:46 compute-0 nova_compute[186329]: 2025-12-05 06:33:46.496 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2a0d8e58-c4, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:33:46 compute-0 nova_compute[186329]: 2025-12-05 06:33:46.496 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap2a0d8e58-c4, col_values=(('qos', UUID('b9d2fb4f-a176-4723-8487-6efaba629b91')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:33:46 compute-0 nova_compute[186329]: 2025-12-05 06:33:46.496 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap2a0d8e58-c4, col_values=(('external_ids', {'iface-id': '2a0d8e58-c481-432e-93eb-24622dcdafe6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:13:6e:e4', 'vm-uuid': 'e148517d-c380-4199-be21-5345ab8bbacd'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:33:46 compute-0 nova_compute[186329]: 2025-12-05 06:33:46.497 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:46 compute-0 NetworkManager[55434]: <info>  [1764916426.4981] manager: (tap2a0d8e58-c4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/91)
Dec 05 06:33:46 compute-0 nova_compute[186329]: 2025-12-05 06:33:46.499 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:33:46 compute-0 nova_compute[186329]: 2025-12-05 06:33:46.503 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:46 compute-0 nova_compute[186329]: 2025-12-05 06:33:46.503 186333 INFO os_vif [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:13:6e:e4,bridge_name='br-int',has_traffic_filtering=True,id=2a0d8e58-c481-432e-93eb-24622dcdafe6,network=Network(b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2a0d8e58-c4')
Dec 05 06:33:46 compute-0 nova_compute[186329]: 2025-12-05 06:33:46.504 186333 DEBUG nova.virt.libvirt.driver [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Dec 05 06:33:46 compute-0 nova_compute[186329]: 2025-12-05 06:33:46.504 186333 DEBUG nova.compute.manager [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4wjjiu2f',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e148517d-c380-4199-be21-5345ab8bbacd',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Dec 05 06:33:46 compute-0 nova_compute[186329]: 2025-12-05 06:33:46.505 186333 WARNING neutronclient.v2_0.client [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:33:46 compute-0 nova_compute[186329]: 2025-12-05 06:33:46.670 186333 WARNING neutronclient.v2_0.client [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:33:46 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:46.847 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '46:84:d1', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'a6:99:88:db:d9:a2'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:33:46 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:46.848 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 05 06:33:46 compute-0 nova_compute[186329]: 2025-12-05 06:33:46.848 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:47 compute-0 nova_compute[186329]: 2025-12-05 06:33:47.250 186333 DEBUG nova.network.neutron [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: e148517d-c380-4199-be21-5345ab8bbacd] Port 2a0d8e58-c481-432e-93eb-24622dcdafe6 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Dec 05 06:33:47 compute-0 nova_compute[186329]: 2025-12-05 06:33:47.256 186333 DEBUG nova.compute.manager [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4wjjiu2f',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e148517d-c380-4199-be21-5345ab8bbacd',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Dec 05 06:33:49 compute-0 nova_compute[186329]: 2025-12-05 06:33:49.052 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:50 compute-0 NetworkManager[55434]: <info>  [1764916430.5442] manager: (tap2a0d8e58-c4): new Tun device (/org/freedesktop/NetworkManager/Devices/92)
Dec 05 06:33:50 compute-0 kernel: tap2a0d8e58-c4: entered promiscuous mode
Dec 05 06:33:50 compute-0 ovn_controller[95223]: 2025-12-05T06:33:50Z|00213|binding|INFO|Claiming lport 2a0d8e58-c481-432e-93eb-24622dcdafe6 for this additional chassis.
Dec 05 06:33:50 compute-0 ovn_controller[95223]: 2025-12-05T06:33:50Z|00214|binding|INFO|2a0d8e58-c481-432e-93eb-24622dcdafe6: Claiming fa:16:3e:13:6e:e4 10.100.0.5
Dec 05 06:33:50 compute-0 nova_compute[186329]: 2025-12-05 06:33:50.548 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:50 compute-0 ovn_controller[95223]: 2025-12-05T06:33:50Z|00215|binding|INFO|Setting lport 2a0d8e58-c481-432e-93eb-24622dcdafe6 ovn-installed in OVS
Dec 05 06:33:50 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:50.562 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:13:6e:e4 10.100.0.5'], port_security=['fa:16:3e:13:6e:e4 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'e148517d-c380-4199-be21-5345ab8bbacd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3c9c02fc0c8641c48e6ccfd619fde68d', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'bcbaedeb-3d7e-4385-ba5a-8212e2f97a4c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fdc9478a-32e1-4f82-931f-bf423e0028ed, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=2a0d8e58-c481-432e-93eb-24622dcdafe6) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:33:50 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:50.563 104041 INFO neutron.agent.ovn.metadata.agent [-] Port 2a0d8e58-c481-432e-93eb-24622dcdafe6 in datapath b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4 unbound from our chassis
Dec 05 06:33:50 compute-0 nova_compute[186329]: 2025-12-05 06:33:50.563 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:50 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:50.564 104041 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4
Dec 05 06:33:50 compute-0 nova_compute[186329]: 2025-12-05 06:33:50.566 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:50 compute-0 systemd-udevd[215153]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 06:33:50 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:50.575 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[7a125113-1f01-4c5d-9849-9c461060a33f]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:33:50 compute-0 systemd-machined[152967]: New machine qemu-20-instance-00000019.
Dec 05 06:33:50 compute-0 NetworkManager[55434]: <info>  [1764916430.5826] device (tap2a0d8e58-c4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 06:33:50 compute-0 NetworkManager[55434]: <info>  [1764916430.5832] device (tap2a0d8e58-c4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 06:33:50 compute-0 systemd[1]: Started Virtual Machine qemu-20-instance-00000019.
Dec 05 06:33:50 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:50.607 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[738e539b-a552-4d49-9c68-730b979f08c5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:33:50 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:50.608 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[75ad9b82-bbee-406e-a530-d307cf47825e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:33:50 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:50.627 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[34e66274-6c6f-4c0b-a188-d563623c4cf5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:33:50 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:50.640 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[b2c4ce49-dc02-40ec-b4d2-31498f51a9aa]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb1b8634d-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:73:d2:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 26, 'tx_packets': 5, 'rx_bytes': 1372, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 26, 'tx_packets': 5, 'rx_bytes': 1372, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 428214, 'reachable_time': 42190, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215167, 'error': None, 'target': 'ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:33:50 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:50.652 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[6e960d20-b9b0-41d4-97ac-b15c885449d3]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb1b8634d-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 428220, 'tstamp': 428220}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215168, 'error': None, 'target': 'ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb1b8634d-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 428222, 'tstamp': 428222}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215168, 'error': None, 'target': 'ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:33:50 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:50.653 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb1b8634d-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:33:50 compute-0 nova_compute[186329]: 2025-12-05 06:33:50.654 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:50 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:50.656 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb1b8634d-60, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:33:50 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:50.656 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:33:50 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:50.656 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb1b8634d-60, col_values=(('external_ids', {'iface-id': 'd9989e5b-035f-46c9-8e0e-5fd96cf594af'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:33:50 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:50.657 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:33:50 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:50.657 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[115b9a60-6841-4a69-8d0f-96eabfa53d1f]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:33:51 compute-0 nova_compute[186329]: 2025-12-05 06:33:51.499 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:53 compute-0 podman[215184]: 2025-12-05 06:33:53.459237055 +0000 UTC m=+0.043853811 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 05 06:33:53 compute-0 podman[215186]: 2025-12-05 06:33:53.472444637 +0000 UTC m=+0.053977495 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Dec 05 06:33:53 compute-0 podman[215185]: 2025-12-05 06:33:53.494391629 +0000 UTC m=+0.077795957 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc.)
Dec 05 06:33:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:33:53.849 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=89d40815-76f5-4f1d-9077-84d831b7d6c4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:33:54 compute-0 nova_compute[186329]: 2025-12-05 06:33:54.053 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:54 compute-0 ovn_controller[95223]: 2025-12-05T06:33:54Z|00216|binding|INFO|Claiming lport 2a0d8e58-c481-432e-93eb-24622dcdafe6 for this chassis.
Dec 05 06:33:54 compute-0 ovn_controller[95223]: 2025-12-05T06:33:54Z|00217|binding|INFO|2a0d8e58-c481-432e-93eb-24622dcdafe6: Claiming fa:16:3e:13:6e:e4 10.100.0.5
Dec 05 06:33:54 compute-0 ovn_controller[95223]: 2025-12-05T06:33:54Z|00218|binding|INFO|Setting lport 2a0d8e58-c481-432e-93eb-24622dcdafe6 up in Southbound
Dec 05 06:33:55 compute-0 nova_compute[186329]: 2025-12-05 06:33:55.817 186333 INFO nova.compute.manager [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: e148517d-c380-4199-be21-5345ab8bbacd] Post operation of migration started
Dec 05 06:33:55 compute-0 nova_compute[186329]: 2025-12-05 06:33:55.818 186333 WARNING neutronclient.v2_0.client [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:33:55 compute-0 nova_compute[186329]: 2025-12-05 06:33:55.879 186333 WARNING neutronclient.v2_0.client [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:33:55 compute-0 nova_compute[186329]: 2025-12-05 06:33:55.879 186333 WARNING neutronclient.v2_0.client [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:33:56 compute-0 nova_compute[186329]: 2025-12-05 06:33:56.501 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:56 compute-0 nova_compute[186329]: 2025-12-05 06:33:56.516 186333 DEBUG oslo_concurrency.lockutils [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "refresh_cache-e148517d-c380-4199-be21-5345ab8bbacd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:33:56 compute-0 nova_compute[186329]: 2025-12-05 06:33:56.516 186333 DEBUG oslo_concurrency.lockutils [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquired lock "refresh_cache-e148517d-c380-4199-be21-5345ab8bbacd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:33:56 compute-0 nova_compute[186329]: 2025-12-05 06:33:56.516 186333 DEBUG nova.network.neutron [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: e148517d-c380-4199-be21-5345ab8bbacd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 05 06:33:57 compute-0 nova_compute[186329]: 2025-12-05 06:33:57.020 186333 WARNING neutronclient.v2_0.client [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:33:57 compute-0 nova_compute[186329]: 2025-12-05 06:33:57.862 186333 WARNING neutronclient.v2_0.client [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:33:57 compute-0 nova_compute[186329]: 2025-12-05 06:33:57.980 186333 DEBUG nova.network.neutron [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: e148517d-c380-4199-be21-5345ab8bbacd] Updating instance_info_cache with network_info: [{"id": "2a0d8e58-c481-432e-93eb-24622dcdafe6", "address": "fa:16:3e:13:6e:e4", "network": {"id": "b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-188557792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32d36f4d4d354e18b9270b2f2f540379", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2a0d8e58-c4", "ovs_interfaceid": "2a0d8e58-c481-432e-93eb-24622dcdafe6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:33:58 compute-0 nova_compute[186329]: 2025-12-05 06:33:58.484 186333 DEBUG oslo_concurrency.lockutils [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Releasing lock "refresh_cache-e148517d-c380-4199-be21-5345ab8bbacd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:33:58 compute-0 nova_compute[186329]: 2025-12-05 06:33:58.995 186333 DEBUG oslo_concurrency.lockutils [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:33:58 compute-0 nova_compute[186329]: 2025-12-05 06:33:58.995 186333 DEBUG oslo_concurrency.lockutils [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:33:58 compute-0 nova_compute[186329]: 2025-12-05 06:33:58.996 186333 DEBUG oslo_concurrency.lockutils [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:33:58 compute-0 nova_compute[186329]: 2025-12-05 06:33:58.999 186333 INFO nova.virt.libvirt.driver [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: e148517d-c380-4199-be21-5345ab8bbacd] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Dec 05 06:33:59 compute-0 virtqemud[186605]: Domain id=20 name='instance-00000019' uuid=e148517d-c380-4199-be21-5345ab8bbacd is tainted: custom-monitor
Dec 05 06:33:59 compute-0 nova_compute[186329]: 2025-12-05 06:33:59.054 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:33:59 compute-0 podman[196599]: time="2025-12-05T06:33:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:33:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:33:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18590 "" "Go-http-client/1.1"
Dec 05 06:33:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:33:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3054 "" "Go-http-client/1.1"
Dec 05 06:34:00 compute-0 nova_compute[186329]: 2025-12-05 06:34:00.003 186333 INFO nova.virt.libvirt.driver [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: e148517d-c380-4199-be21-5345ab8bbacd] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Dec 05 06:34:00 compute-0 nova_compute[186329]: 2025-12-05 06:34:00.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:34:01 compute-0 nova_compute[186329]: 2025-12-05 06:34:01.007 186333 INFO nova.virt.libvirt.driver [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: e148517d-c380-4199-be21-5345ab8bbacd] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Dec 05 06:34:01 compute-0 nova_compute[186329]: 2025-12-05 06:34:01.010 186333 DEBUG nova.compute.manager [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: e148517d-c380-4199-be21-5345ab8bbacd] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 05 06:34:01 compute-0 openstack_network_exporter[198686]: ERROR   06:34:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:34:01 compute-0 openstack_network_exporter[198686]: ERROR   06:34:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:34:01 compute-0 openstack_network_exporter[198686]: ERROR   06:34:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:34:01 compute-0 openstack_network_exporter[198686]: ERROR   06:34:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:34:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:34:01 compute-0 openstack_network_exporter[198686]: ERROR   06:34:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:34:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:34:01 compute-0 nova_compute[186329]: 2025-12-05 06:34:01.502 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:01 compute-0 nova_compute[186329]: 2025-12-05 06:34:01.516 186333 DEBUG nova.objects.instance [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: e148517d-c380-4199-be21-5345ab8bbacd] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Dec 05 06:34:01 compute-0 nova_compute[186329]: 2025-12-05 06:34:01.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:34:01 compute-0 nova_compute[186329]: 2025-12-05 06:34:01.710 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 05 06:34:02 compute-0 nova_compute[186329]: 2025-12-05 06:34:02.705 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:34:02 compute-0 nova_compute[186329]: 2025-12-05 06:34:02.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.217 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.218 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.218 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.218 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server [None req-312aa44b-be1f-4441-a186-f236a6d321cf e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Exception during message handling: nova.exception_Remote.UnexpectedDeletingTaskStateError_Remote: Conflict updating instance e148517d-c380-4199-be21-5345ab8bbacd. Expected: {'task_state': ['migrating']}. Actual: {'task_state': 'deleting'}
Dec 05 06:34:03 compute-0 nova_compute[186329]: Traceback (most recent call last):
Dec 05 06:34:03 compute-0 nova_compute[186329]: 
Dec 05 06:34:03 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 2386, in _instance_update
Dec 05 06:34:03 compute-0 nova_compute[186329]:     update_on_match(compare, 'uuid', updates)
Dec 05 06:34:03 compute-0 nova_compute[186329]: 
Dec 05 06:34:03 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_db/sqlalchemy/orm.py", line 52, in update_on_match
Dec 05 06:34:03 compute-0 nova_compute[186329]:     return update_match.update_on_match(
Dec 05 06:34:03 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:34:03 compute-0 nova_compute[186329]: 
Dec 05 06:34:03 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_db/sqlalchemy/update_match.py", line 194, in update_on_match
Dec 05 06:34:03 compute-0 nova_compute[186329]:     raise NoRowsMatched("Zero rows matched for %d attempts" % attempts)
Dec 05 06:34:03 compute-0 nova_compute[186329]: 
Dec 05 06:34:03 compute-0 nova_compute[186329]: oslo_db.sqlalchemy.update_match.NoRowsMatched: Zero rows matched for 3 attempts
Dec 05 06:34:03 compute-0 nova_compute[186329]: 
Dec 05 06:34:03 compute-0 nova_compute[186329]: 
Dec 05 06:34:03 compute-0 nova_compute[186329]: During handling of the above exception, another exception occurred:
Dec 05 06:34:03 compute-0 nova_compute[186329]: 
Dec 05 06:34:03 compute-0 nova_compute[186329]: 
Dec 05 06:34:03 compute-0 nova_compute[186329]: Traceback (most recent call last):
Dec 05 06:34:03 compute-0 nova_compute[186329]: 
Dec 05 06:34:03 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/conductor/manager.py", line 143, in _object_dispatch
Dec 05 06:34:03 compute-0 nova_compute[186329]:     return getattr(target, method)(*args, **kwargs)
Dec 05 06:34:03 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:34:03 compute-0 nova_compute[186329]: 
Dec 05 06:34:03 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_versionedobjects/base.py", line 226, in wrapper
Dec 05 06:34:03 compute-0 nova_compute[186329]:     return fn(self, *args, **kwargs)
Dec 05 06:34:03 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:34:03 compute-0 nova_compute[186329]: 
Dec 05 06:34:03 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/objects/instance.py", line 878, in save
Dec 05 06:34:03 compute-0 nova_compute[186329]:     old_ref, inst_ref = db.instance_update_and_get_original(
Dec 05 06:34:03 compute-0 nova_compute[186329]:                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:34:03 compute-0 nova_compute[186329]: 
Dec 05 06:34:03 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/db/utils.py", line 35, in wrapper
Dec 05 06:34:03 compute-0 nova_compute[186329]:     return f(*args, **kwargs)
Dec 05 06:34:03 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^
Dec 05 06:34:03 compute-0 nova_compute[186329]: 
Dec 05 06:34:03 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 144, in wrapper
Dec 05 06:34:03 compute-0 nova_compute[186329]:     with excutils.save_and_reraise_exception() as ectxt:
Dec 05 06:34:03 compute-0 nova_compute[186329]:          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:34:03 compute-0 nova_compute[186329]: 
Dec 05 06:34:03 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 05 06:34:03 compute-0 nova_compute[186329]:     self.force_reraise()
Dec 05 06:34:03 compute-0 nova_compute[186329]: 
Dec 05 06:34:03 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 05 06:34:03 compute-0 nova_compute[186329]:     raise self.value
Dec 05 06:34:03 compute-0 nova_compute[186329]: 
Dec 05 06:34:03 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 142, in wrapper
Dec 05 06:34:03 compute-0 nova_compute[186329]:     return f(*args, **kwargs)
Dec 05 06:34:03 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^
Dec 05 06:34:03 compute-0 nova_compute[186329]: 
Dec 05 06:34:03 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 207, in wrapper
Dec 05 06:34:03 compute-0 nova_compute[186329]:     return f(context, *args, **kwargs)
Dec 05 06:34:03 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:34:03 compute-0 nova_compute[186329]: 
Dec 05 06:34:03 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 2303, in instance_update_and_get_original
Dec 05 06:34:03 compute-0 nova_compute[186329]:     return (copy.copy(instance_ref), _instance_update(
Dec 05 06:34:03 compute-0 nova_compute[186329]:                                      ^^^^^^^^^^^^^^^^^
Dec 05 06:34:03 compute-0 nova_compute[186329]: 
Dec 05 06:34:03 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 2445, in _instance_update
Dec 05 06:34:03 compute-0 nova_compute[186329]:     raise exc(**exc_props)
Dec 05 06:34:03 compute-0 nova_compute[186329]: 
Dec 05 06:34:03 compute-0 nova_compute[186329]: nova.exception.UnexpectedDeletingTaskStateError: Conflict updating instance e148517d-c380-4199-be21-5345ab8bbacd. Expected: {'task_state': ['migrating']}. Actual: {'task_state': 'deleting'}
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/server.py", line 174, in _process_incoming
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server              ^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/exception_wrapper.py", line 65, in wrapped
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception():
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server     self.force_reraise()
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server     raise self.value
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/exception_wrapper.py", line 63, in wrapped
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server     return f(self, context, *args, **kw)
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/compute/utils.py", line 1483, in decorated_function
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 213, in decorated_function
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception():
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server     self.force_reraise()
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server     raise self.value
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 203, in decorated_function
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 10237, in post_live_migration_at_destination
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server     instance.save(expected_task_state=task_states.MIGRATING)
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_versionedobjects/base.py", line 209, in wrapper
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server     updates, result = self.indirection_api.object_action(
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/conductor/rpcapi.py", line 247, in object_action
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server     return cctxt.call(context, 'object_action', objinst=objinst,
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/client.py", line 180, in call
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server     result = self.transport._send(
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server              ^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_messaging/transport.py", line 123, in _send
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server     return self._driver.send(target, ctxt, message,
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 794, in send
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server     return self._send(target, ctxt, message, wait_for_reply, timeout,
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 786, in _send
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server     raise result
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server nova.exception_Remote.UnexpectedDeletingTaskStateError_Remote: Conflict updating instance e148517d-c380-4199-be21-5345ab8bbacd. Expected: {'task_state': ['migrating']}. Actual: {'task_state': 'deleting'}
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 2386, in _instance_update
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server     update_on_match(compare, 'uuid', updates)
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_db/sqlalchemy/orm.py", line 52, in update_on_match
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server     return update_match.update_on_match(
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_db/sqlalchemy/update_match.py", line 194, in update_on_match
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server     raise NoRowsMatched("Zero rows matched for %d attempts" % attempts)
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server oslo_db.sqlalchemy.update_match.NoRowsMatched: Zero rows matched for 3 attempts
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred:
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/conductor/manager.py", line 143, in _object_dispatch
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server     return getattr(target, method)(*args, **kwargs)
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_versionedobjects/base.py", line 226, in wrapper
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server     return fn(self, *args, **kwargs)
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/objects/instance.py", line 878, in save
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server     old_ref, inst_ref = db.instance_update_and_get_original(
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/db/utils.py", line 35, in wrapper
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server     return f(*args, **kwargs)
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 144, in wrapper
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception() as ectxt:
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server     self.force_reraise()
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server     raise self.value
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 142, in wrapper
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server     return f(*args, **kwargs)
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 207, in wrapper
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server     return f(context, *args, **kwargs)
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 2303, in instance_update_and_get_original
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server     return (copy.copy(instance_ref), _instance_update(
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server                                      ^^^^^^^^^^^^^^^^^
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 2445, in _instance_update
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server     raise exc(**exc_props)
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server nova.exception.UnexpectedDeletingTaskStateError: Conflict updating instance e148517d-c380-4199-be21-5345ab8bbacd. Expected: {'task_state': ['migrating']}. Actual: {'task_state': 'deleting'}
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:34:03 compute-0 nova_compute[186329]: 2025-12-05 06:34:03.543 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:34:04 compute-0 nova_compute[186329]: 2025-12-05 06:34:04.055 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:04 compute-0 nova_compute[186329]: 2025-12-05 06:34:04.246 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbcb12e6-ebf7-49e2-847a-65f1b3a3266c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:34:04 compute-0 nova_compute[186329]: 2025-12-05 06:34:04.288 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbcb12e6-ebf7-49e2-847a-65f1b3a3266c/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:34:04 compute-0 nova_compute[186329]: 2025-12-05 06:34:04.289 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbcb12e6-ebf7-49e2-847a-65f1b3a3266c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:34:04 compute-0 nova_compute[186329]: 2025-12-05 06:34:04.331 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbcb12e6-ebf7-49e2-847a-65f1b3a3266c/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:34:04 compute-0 nova_compute[186329]: 2025-12-05 06:34:04.334 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:34:04 compute-0 nova_compute[186329]: 2025-12-05 06:34:04.376 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:34:04 compute-0 nova_compute[186329]: 2025-12-05 06:34:04.376 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:34:04 compute-0 nova_compute[186329]: 2025-12-05 06:34:04.418 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:34:04 compute-0 nova_compute[186329]: 2025-12-05 06:34:04.422 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e148517d-c380-4199-be21-5345ab8bbacd/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:34:04 compute-0 nova_compute[186329]: 2025-12-05 06:34:04.474 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e148517d-c380-4199-be21-5345ab8bbacd/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:34:04 compute-0 nova_compute[186329]: 2025-12-05 06:34:04.474 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e148517d-c380-4199-be21-5345ab8bbacd/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:34:04 compute-0 nova_compute[186329]: 2025-12-05 06:34:04.517 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e148517d-c380-4199-be21-5345ab8bbacd/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:34:04 compute-0 nova_compute[186329]: 2025-12-05 06:34:04.521 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a9c6d4f3-641d-4e96-a015-0a81a2b58225/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:34:04 compute-0 nova_compute[186329]: 2025-12-05 06:34:04.563 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a9c6d4f3-641d-4e96-a015-0a81a2b58225/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:34:04 compute-0 nova_compute[186329]: 2025-12-05 06:34:04.563 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a9c6d4f3-641d-4e96-a015-0a81a2b58225/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:34:04 compute-0 nova_compute[186329]: 2025-12-05 06:34:04.607 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a9c6d4f3-641d-4e96-a015-0a81a2b58225/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:34:04 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:04.664 104041 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 30a6a332-0943-4e50-885d-424fcc88451e with type ""
Dec 05 06:34:04 compute-0 ovn_controller[95223]: 2025-12-05T06:34:04Z|00219|binding|INFO|Removing iface tap2a0d8e58-c4 ovn-installed in OVS
Dec 05 06:34:04 compute-0 ovn_controller[95223]: 2025-12-05T06:34:04Z|00220|binding|INFO|Removing lport 2a0d8e58-c481-432e-93eb-24622dcdafe6 ovn-installed in OVS
Dec 05 06:34:04 compute-0 nova_compute[186329]: 2025-12-05 06:34:04.665 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:04 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:04.672 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:13:6e:e4 10.100.0.5'], port_security=['fa:16:3e:13:6e:e4 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'e148517d-c380-4199-be21-5345ab8bbacd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3c9c02fc0c8641c48e6ccfd619fde68d', 'neutron:revision_number': '14', 'neutron:security_group_ids': 'bcbaedeb-3d7e-4385-ba5a-8212e2f97a4c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fdc9478a-32e1-4f82-931f-bf423e0028ed, chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], logical_port=2a0d8e58-c481-432e-93eb-24622dcdafe6) old= matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:34:04 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:04.673 104041 INFO neutron.agent.ovn.metadata.agent [-] Port 2a0d8e58-c481-432e-93eb-24622dcdafe6 in datapath b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4 unbound from our chassis
Dec 05 06:34:04 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:04.674 104041 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4
Dec 05 06:34:04 compute-0 nova_compute[186329]: 2025-12-05 06:34:04.681 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:04 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:04.689 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[cff5967e-a4f9-4cae-a873-df55a042c81f]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:34:04 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:04.713 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[f5047725-0187-440e-bc88-2583e69fc5bd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:34:04 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:04.715 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[4d699c81-241c-4d65-9e30-3f6211d533d3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:34:04 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:04.740 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[e9eebd6b-ac0b-425e-b75c-d56ac95f4c40]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:34:04 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:04.758 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[87b041e0-019b-4d7e-8fed-57272b352b61]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb1b8634d-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:73:d2:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 46, 'tx_packets': 7, 'rx_bytes': 2212, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 46, 'tx_packets': 7, 'rx_bytes': 2212, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 428214, 'reachable_time': 42190, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215265, 'error': None, 'target': 'ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:34:04 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:04.773 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[bb2a2100-4442-4311-a500-415b8d240000]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb1b8634d-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 428220, 'tstamp': 428220}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215266, 'error': None, 'target': 'ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb1b8634d-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 428222, 'tstamp': 428222}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215266, 'error': None, 'target': 'ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:34:04 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:04.775 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb1b8634d-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:34:04 compute-0 nova_compute[186329]: 2025-12-05 06:34:04.776 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:04 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:04.777 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb1b8634d-60, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:34:04 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:04.777 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:34:04 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:04.778 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb1b8634d-60, col_values=(('external_ids', {'iface-id': 'd9989e5b-035f-46c9-8e0e-5fd96cf594af'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:34:04 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:04.778 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:34:04 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:04.780 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[176723f0-a2ce-4267-b5e3-e0d05899b444]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:34:04 compute-0 nova_compute[186329]: 2025-12-05 06:34:04.852 186333 WARNING nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:34:04 compute-0 nova_compute[186329]: 2025-12-05 06:34:04.853 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:34:04 compute-0 nova_compute[186329]: 2025-12-05 06:34:04.868 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.014s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:34:04 compute-0 nova_compute[186329]: 2025-12-05 06:34:04.868 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5132MB free_disk=73.04639053344727GB free_vcpus=0 pci_devices=[{"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": 
"0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 05 06:34:04 compute-0 nova_compute[186329]: 2025-12-05 06:34:04.868 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:34:04 compute-0 nova_compute[186329]: 2025-12-05 06:34:04.868 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:34:06 compute-0 nova_compute[186329]: 2025-12-05 06:34:06.387 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Migration for instance e148517d-c380-4199-be21-5345ab8bbacd refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Dec 05 06:34:06 compute-0 nova_compute[186329]: 2025-12-05 06:34:06.505 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:06 compute-0 nova_compute[186329]: 2025-12-05 06:34:06.890 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] [instance: e148517d-c380-4199-be21-5345ab8bbacd] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Dec 05 06:34:07 compute-0 nova_compute[186329]: 2025-12-05 06:34:07.412 186333 INFO nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance 83dcc9e9-9fcf-4e34-8b76-598c38ed9bc0 has allocations against this compute host but is not found in the database.
Dec 05 06:34:07 compute-0 nova_compute[186329]: 2025-12-05 06:34:07.412 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance a9c6d4f3-641d-4e96-a015-0a81a2b58225 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 05 06:34:07 compute-0 nova_compute[186329]: 2025-12-05 06:34:07.412 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 05 06:34:07 compute-0 nova_compute[186329]: 2025-12-05 06:34:07.412 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 06:34:04 up  1:12,  0 user,  load average: 0.02, 0.07, 0.16\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_3c9c02fc0c8641c48e6ccfd619fde68d': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 05 06:34:07 compute-0 nova_compute[186329]: 2025-12-05 06:34:07.450 186333 DEBUG nova.compute.provider_tree [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:34:07 compute-0 nova_compute[186329]: 2025-12-05 06:34:07.937 186333 DEBUG oslo_concurrency.lockutils [None req-052b66a1-58bf-4918-9a38-e72c99592762 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Acquiring lock "a9c6d4f3-641d-4e96-a015-0a81a2b58225" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:34:07 compute-0 nova_compute[186329]: 2025-12-05 06:34:07.937 186333 DEBUG oslo_concurrency.lockutils [None req-052b66a1-58bf-4918-9a38-e72c99592762 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Lock "a9c6d4f3-641d-4e96-a015-0a81a2b58225" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:34:07 compute-0 nova_compute[186329]: 2025-12-05 06:34:07.937 186333 DEBUG oslo_concurrency.lockutils [None req-052b66a1-58bf-4918-9a38-e72c99592762 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Acquiring lock "a9c6d4f3-641d-4e96-a015-0a81a2b58225-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:34:07 compute-0 nova_compute[186329]: 2025-12-05 06:34:07.938 186333 DEBUG oslo_concurrency.lockutils [None req-052b66a1-58bf-4918-9a38-e72c99592762 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Lock "a9c6d4f3-641d-4e96-a015-0a81a2b58225-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:34:07 compute-0 nova_compute[186329]: 2025-12-05 06:34:07.938 186333 DEBUG oslo_concurrency.lockutils [None req-052b66a1-58bf-4918-9a38-e72c99592762 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Lock "a9c6d4f3-641d-4e96-a015-0a81a2b58225-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:34:07 compute-0 nova_compute[186329]: 2025-12-05 06:34:07.945 186333 INFO nova.compute.manager [None req-052b66a1-58bf-4918-9a38-e72c99592762 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] [instance: a9c6d4f3-641d-4e96-a015-0a81a2b58225] Terminating instance
Dec 05 06:34:07 compute-0 nova_compute[186329]: 2025-12-05 06:34:07.952 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:34:08 compute-0 nova_compute[186329]: 2025-12-05 06:34:08.454 186333 DEBUG nova.compute.manager [None req-052b66a1-58bf-4918-9a38-e72c99592762 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] [instance: a9c6d4f3-641d-4e96-a015-0a81a2b58225] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 05 06:34:08 compute-0 nova_compute[186329]: 2025-12-05 06:34:08.457 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 05 06:34:08 compute-0 nova_compute[186329]: 2025-12-05 06:34:08.457 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.589s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:34:08 compute-0 kernel: tap271c902e-30 (unregistering): left promiscuous mode
Dec 05 06:34:08 compute-0 NetworkManager[55434]: <info>  [1764916448.4770] device (tap271c902e-30): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 06:34:08 compute-0 ovn_controller[95223]: 2025-12-05T06:34:08Z|00221|binding|INFO|Releasing lport 271c902e-30a9-4018-a883-e0977a7c70bd from this chassis (sb_readonly=0)
Dec 05 06:34:08 compute-0 ovn_controller[95223]: 2025-12-05T06:34:08Z|00222|binding|INFO|Setting lport 271c902e-30a9-4018-a883-e0977a7c70bd down in Southbound
Dec 05 06:34:08 compute-0 ovn_controller[95223]: 2025-12-05T06:34:08Z|00223|binding|INFO|Removing iface tap271c902e-30 ovn-installed in OVS
Dec 05 06:34:08 compute-0 nova_compute[186329]: 2025-12-05 06:34:08.486 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:08.492 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:20:ed:dc 10.100.0.4'], port_security=['fa:16:3e:20:ed:dc 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'a9c6d4f3-641d-4e96-a015-0a81a2b58225', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3c9c02fc0c8641c48e6ccfd619fde68d', 'neutron:revision_number': '15', 'neutron:security_group_ids': 'bcbaedeb-3d7e-4385-ba5a-8212e2f97a4c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fdc9478a-32e1-4f82-931f-bf423e0028ed, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], logical_port=271c902e-30a9-4018-a883-e0977a7c70bd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:34:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:08.494 104041 INFO neutron.agent.ovn.metadata.agent [-] Port 271c902e-30a9-4018-a883-e0977a7c70bd in datapath b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4 unbound from our chassis
Dec 05 06:34:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:08.497 104041 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 05 06:34:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:08.498 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[0fbce470-f193-46a6-96de-b783181c744e]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:34:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:08.499 104041 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4 namespace which is not needed anymore
Dec 05 06:34:08 compute-0 nova_compute[186329]: 2025-12-05 06:34:08.499 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:08 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000018.scope: Deactivated successfully.
Dec 05 06:34:08 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000018.scope: Consumed 2.300s CPU time.
Dec 05 06:34:08 compute-0 systemd-machined[152967]: Machine qemu-19-instance-00000018 terminated.
Dec 05 06:34:08 compute-0 neutron-haproxy-ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4[215052]: [NOTICE]   (215056) : haproxy version is 3.0.5-8e879a5
Dec 05 06:34:08 compute-0 neutron-haproxy-ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4[215052]: [NOTICE]   (215056) : path to executable is /usr/sbin/haproxy
Dec 05 06:34:08 compute-0 neutron-haproxy-ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4[215052]: [WARNING]  (215056) : Exiting Master process...
Dec 05 06:34:08 compute-0 podman[215290]: 2025-12-05 06:34:08.583826947 +0000 UTC m=+0.021122583 container kill 268eb87f48a950bd84372060ae1dba4dabe839e2534ecd1b0c5def708d8b1633 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec 05 06:34:08 compute-0 neutron-haproxy-ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4[215052]: [ALERT]    (215056) : Current worker (215058) exited with code 143 (Terminated)
Dec 05 06:34:08 compute-0 neutron-haproxy-ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4[215052]: [WARNING]  (215056) : All workers exited. Exiting... (0)
Dec 05 06:34:08 compute-0 systemd[1]: libpod-268eb87f48a950bd84372060ae1dba4dabe839e2534ecd1b0c5def708d8b1633.scope: Deactivated successfully.
Dec 05 06:34:08 compute-0 podman[215302]: 2025-12-05 06:34:08.619994987 +0000 UTC m=+0.021172809 container died 268eb87f48a950bd84372060ae1dba4dabe839e2534ecd1b0c5def708d8b1633 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 05 06:34:08 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-268eb87f48a950bd84372060ae1dba4dabe839e2534ecd1b0c5def708d8b1633-userdata-shm.mount: Deactivated successfully.
Dec 05 06:34:08 compute-0 systemd[1]: var-lib-containers-storage-overlay-1bea6c7595a888f1f3451a9d5b5f931e5eefaf0296afdd3c6ab9f8700a83256a-merged.mount: Deactivated successfully.
Dec 05 06:34:08 compute-0 podman[215302]: 2025-12-05 06:34:08.643063518 +0000 UTC m=+0.044241320 container cleanup 268eb87f48a950bd84372060ae1dba4dabe839e2534ecd1b0c5def708d8b1633 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Dec 05 06:34:08 compute-0 systemd[1]: libpod-conmon-268eb87f48a950bd84372060ae1dba4dabe839e2534ecd1b0c5def708d8b1633.scope: Deactivated successfully.
Dec 05 06:34:08 compute-0 podman[215304]: 2025-12-05 06:34:08.652745682 +0000 UTC m=+0.049362899 container remove 268eb87f48a950bd84372060ae1dba4dabe839e2534ecd1b0c5def708d8b1633 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 05 06:34:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:08.657 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[143e0019-f1a0-4ea4-ba18-111778518364]: (4, ("Fri Dec  5 06:34:08 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4 (268eb87f48a950bd84372060ae1dba4dabe839e2534ecd1b0c5def708d8b1633)\n268eb87f48a950bd84372060ae1dba4dabe839e2534ecd1b0c5def708d8b1633\nFri Dec  5 06:34:08 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4 (268eb87f48a950bd84372060ae1dba4dabe839e2534ecd1b0c5def708d8b1633)\n268eb87f48a950bd84372060ae1dba4dabe839e2534ecd1b0c5def708d8b1633\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:34:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:08.659 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[40e554f3-cc6f-44b4-8469-396d2dc6799e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:34:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:08.659 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:34:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:08.660 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[14df6b95-cf26-4d94-a3d2-e7f387f0a3f0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:34:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:08.660 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb1b8634d-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:34:08 compute-0 kernel: tapb1b8634d-60: left promiscuous mode
Dec 05 06:34:08 compute-0 nova_compute[186329]: 2025-12-05 06:34:08.665 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:08 compute-0 nova_compute[186329]: 2025-12-05 06:34:08.666 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:08.668 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[3fa4f540-77f5-4690-83d0-440e56bbecf8]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:34:08 compute-0 NetworkManager[55434]: <info>  [1764916448.6710] manager: (tap271c902e-30): new Tun device (/org/freedesktop/NetworkManager/Devices/93)
Dec 05 06:34:08 compute-0 nova_compute[186329]: 2025-12-05 06:34:08.683 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:08 compute-0 nova_compute[186329]: 2025-12-05 06:34:08.686 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:08.687 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[0177390f-ba1e-49c1-baf7-4d8da2a9b1a8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:34:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:08.687 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[d601dde3-bf53-4ff6-a28d-0606b81d81ad]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:34:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:08.699 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[eb57f28b-320f-457e-8f3d-8a921bf06199]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 428209, 'reachable_time': 34964, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215337, 'error': None, 'target': 'ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:34:08 compute-0 systemd[1]: run-netns-ovnmeta\x2db1b8634d\x2d6c3c\x2d4ad1\x2dbde2\x2d3aa71bdb92e4.mount: Deactivated successfully.
Dec 05 06:34:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:08.700 104153 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 05 06:34:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:08.700 104153 DEBUG oslo.privsep.daemon [-] privsep: reply[07b3c3a7-aa16-4921-b0b4-12615bb2004c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:34:08 compute-0 nova_compute[186329]: 2025-12-05 06:34:08.705 186333 DEBUG nova.compute.manager [req-2d5f1f2b-fcfc-4ba9-be5d-d814360ae67a req-aeb2efae-0acf-4270-a324-41e0480aee29 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: a9c6d4f3-641d-4e96-a015-0a81a2b58225] Received event network-vif-unplugged-271c902e-30a9-4018-a883-e0977a7c70bd external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:34:08 compute-0 nova_compute[186329]: 2025-12-05 06:34:08.706 186333 DEBUG oslo_concurrency.lockutils [req-2d5f1f2b-fcfc-4ba9-be5d-d814360ae67a req-aeb2efae-0acf-4270-a324-41e0480aee29 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "a9c6d4f3-641d-4e96-a015-0a81a2b58225-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:34:08 compute-0 nova_compute[186329]: 2025-12-05 06:34:08.706 186333 DEBUG oslo_concurrency.lockutils [req-2d5f1f2b-fcfc-4ba9-be5d-d814360ae67a req-aeb2efae-0acf-4270-a324-41e0480aee29 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "a9c6d4f3-641d-4e96-a015-0a81a2b58225-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:34:08 compute-0 nova_compute[186329]: 2025-12-05 06:34:08.706 186333 DEBUG oslo_concurrency.lockutils [req-2d5f1f2b-fcfc-4ba9-be5d-d814360ae67a req-aeb2efae-0acf-4270-a324-41e0480aee29 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "a9c6d4f3-641d-4e96-a015-0a81a2b58225-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:34:08 compute-0 nova_compute[186329]: 2025-12-05 06:34:08.707 186333 DEBUG nova.compute.manager [req-2d5f1f2b-fcfc-4ba9-be5d-d814360ae67a req-aeb2efae-0acf-4270-a324-41e0480aee29 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: a9c6d4f3-641d-4e96-a015-0a81a2b58225] No waiting events found dispatching network-vif-unplugged-271c902e-30a9-4018-a883-e0977a7c70bd pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:34:08 compute-0 nova_compute[186329]: 2025-12-05 06:34:08.707 186333 DEBUG nova.compute.manager [req-2d5f1f2b-fcfc-4ba9-be5d-d814360ae67a req-aeb2efae-0acf-4270-a324-41e0480aee29 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: a9c6d4f3-641d-4e96-a015-0a81a2b58225] Received event network-vif-unplugged-271c902e-30a9-4018-a883-e0977a7c70bd for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 05 06:34:08 compute-0 nova_compute[186329]: 2025-12-05 06:34:08.709 186333 INFO nova.virt.libvirt.driver [-] [instance: a9c6d4f3-641d-4e96-a015-0a81a2b58225] Instance destroyed successfully.
Dec 05 06:34:08 compute-0 nova_compute[186329]: 2025-12-05 06:34:08.710 186333 DEBUG nova.objects.instance [None req-052b66a1-58bf-4918-9a38-e72c99592762 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Lazy-loading 'resources' on Instance uuid a9c6d4f3-641d-4e96-a015-0a81a2b58225 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:34:09 compute-0 nova_compute[186329]: 2025-12-05 06:34:09.057 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:09 compute-0 nova_compute[186329]: 2025-12-05 06:34:09.214 186333 DEBUG nova.virt.libvirt.vif [None req-052b66a1-58bf-4918-9a38-e72c99592762 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-12-05T06:32:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadStabilizationStrategy-server-915657501',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadstabilizationstrategy-server-9156575',id=24,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T06:32:35Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3c9c02fc0c8641c48e6ccfd619fde68d',ramdisk_id='',reservation_id='r-d0r66gci',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,member,admin,reader',clean_attempts='1',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',ima
ge_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadStabilizationStrategy-217954151',owner_user_name='tempest-TestExecuteWorkloadStabilizationStrategy-217954151-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T06:33:31Z,user_data=None,user_id='19182660d2484754b9a921f3caf09b6b',uuid=a9c6d4f3-641d-4e96-a015-0a81a2b58225,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "271c902e-30a9-4018-a883-e0977a7c70bd", "address": "fa:16:3e:20:ed:dc", "network": {"id": "b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-188557792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32d36f4d4d354e18b9270b2f2f540379", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap271c902e-30", "ovs_interfaceid": "271c902e-30a9-4018-a883-e0977a7c70bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 05 06:34:09 compute-0 nova_compute[186329]: 2025-12-05 06:34:09.214 186333 DEBUG nova.network.os_vif_util [None req-052b66a1-58bf-4918-9a38-e72c99592762 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Converting VIF {"id": "271c902e-30a9-4018-a883-e0977a7c70bd", "address": "fa:16:3e:20:ed:dc", "network": {"id": "b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadStabilizationStrategy-188557792-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "32d36f4d4d354e18b9270b2f2f540379", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap271c902e-30", "ovs_interfaceid": "271c902e-30a9-4018-a883-e0977a7c70bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:34:09 compute-0 nova_compute[186329]: 2025-12-05 06:34:09.215 186333 DEBUG nova.network.os_vif_util [None req-052b66a1-58bf-4918-9a38-e72c99592762 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:20:ed:dc,bridge_name='br-int',has_traffic_filtering=True,id=271c902e-30a9-4018-a883-e0977a7c70bd,network=Network(b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap271c902e-30') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:34:09 compute-0 nova_compute[186329]: 2025-12-05 06:34:09.215 186333 DEBUG os_vif [None req-052b66a1-58bf-4918-9a38-e72c99592762 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:20:ed:dc,bridge_name='br-int',has_traffic_filtering=True,id=271c902e-30a9-4018-a883-e0977a7c70bd,network=Network(b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap271c902e-30') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 05 06:34:09 compute-0 nova_compute[186329]: 2025-12-05 06:34:09.217 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:09 compute-0 nova_compute[186329]: 2025-12-05 06:34:09.217 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap271c902e-30, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:34:09 compute-0 nova_compute[186329]: 2025-12-05 06:34:09.218 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:09 compute-0 nova_compute[186329]: 2025-12-05 06:34:09.220 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:34:09 compute-0 nova_compute[186329]: 2025-12-05 06:34:09.221 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:09 compute-0 nova_compute[186329]: 2025-12-05 06:34:09.221 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=6c9360c5-cf0c-4769-ad7d-34ac0673a321) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:34:09 compute-0 nova_compute[186329]: 2025-12-05 06:34:09.221 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:09 compute-0 nova_compute[186329]: 2025-12-05 06:34:09.222 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:09 compute-0 nova_compute[186329]: 2025-12-05 06:34:09.223 186333 INFO os_vif [None req-052b66a1-58bf-4918-9a38-e72c99592762 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:20:ed:dc,bridge_name='br-int',has_traffic_filtering=True,id=271c902e-30a9-4018-a883-e0977a7c70bd,network=Network(b1b8634d-6c3c-4ad1-bde2-3aa71bdb92e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap271c902e-30')
Dec 05 06:34:09 compute-0 nova_compute[186329]: 2025-12-05 06:34:09.224 186333 INFO nova.virt.libvirt.driver [None req-052b66a1-58bf-4918-9a38-e72c99592762 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] [instance: a9c6d4f3-641d-4e96-a015-0a81a2b58225] Deleting instance files /var/lib/nova/instances/a9c6d4f3-641d-4e96-a015-0a81a2b58225_del
Dec 05 06:34:09 compute-0 nova_compute[186329]: 2025-12-05 06:34:09.224 186333 INFO nova.virt.libvirt.driver [None req-052b66a1-58bf-4918-9a38-e72c99592762 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] [instance: a9c6d4f3-641d-4e96-a015-0a81a2b58225] Deletion of /var/lib/nova/instances/a9c6d4f3-641d-4e96-a015-0a81a2b58225_del complete
Dec 05 06:34:09 compute-0 nova_compute[186329]: 2025-12-05 06:34:09.731 186333 INFO nova.compute.manager [None req-052b66a1-58bf-4918-9a38-e72c99592762 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] [instance: a9c6d4f3-641d-4e96-a015-0a81a2b58225] Took 1.28 seconds to destroy the instance on the hypervisor.
Dec 05 06:34:09 compute-0 nova_compute[186329]: 2025-12-05 06:34:09.732 186333 DEBUG oslo.service.backend._eventlet.loopingcall [None req-052b66a1-58bf-4918-9a38-e72c99592762 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 05 06:34:09 compute-0 nova_compute[186329]: 2025-12-05 06:34:09.732 186333 DEBUG nova.compute.manager [-] [instance: a9c6d4f3-641d-4e96-a015-0a81a2b58225] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 05 06:34:09 compute-0 nova_compute[186329]: 2025-12-05 06:34:09.732 186333 DEBUG nova.network.neutron [-] [instance: a9c6d4f3-641d-4e96-a015-0a81a2b58225] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 05 06:34:09 compute-0 nova_compute[186329]: 2025-12-05 06:34:09.733 186333 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:34:10 compute-0 nova_compute[186329]: 2025-12-05 06:34:10.457 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:34:10 compute-0 nova_compute[186329]: 2025-12-05 06:34:10.458 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:34:10 compute-0 nova_compute[186329]: 2025-12-05 06:34:10.458 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:34:10 compute-0 nova_compute[186329]: 2025-12-05 06:34:10.679 186333 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:34:10 compute-0 nova_compute[186329]: 2025-12-05 06:34:10.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:34:10 compute-0 nova_compute[186329]: 2025-12-05 06:34:10.744 186333 DEBUG nova.compute.manager [req-cb3d4d19-8744-4b48-ba5c-5ac86c848845 req-82011b45-5b52-4d6e-bfdb-49f7227523d1 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: a9c6d4f3-641d-4e96-a015-0a81a2b58225] Received event network-vif-unplugged-271c902e-30a9-4018-a883-e0977a7c70bd external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:34:10 compute-0 nova_compute[186329]: 2025-12-05 06:34:10.744 186333 DEBUG oslo_concurrency.lockutils [req-cb3d4d19-8744-4b48-ba5c-5ac86c848845 req-82011b45-5b52-4d6e-bfdb-49f7227523d1 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "a9c6d4f3-641d-4e96-a015-0a81a2b58225-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:34:10 compute-0 nova_compute[186329]: 2025-12-05 06:34:10.745 186333 DEBUG oslo_concurrency.lockutils [req-cb3d4d19-8744-4b48-ba5c-5ac86c848845 req-82011b45-5b52-4d6e-bfdb-49f7227523d1 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "a9c6d4f3-641d-4e96-a015-0a81a2b58225-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:34:10 compute-0 nova_compute[186329]: 2025-12-05 06:34:10.745 186333 DEBUG oslo_concurrency.lockutils [req-cb3d4d19-8744-4b48-ba5c-5ac86c848845 req-82011b45-5b52-4d6e-bfdb-49f7227523d1 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "a9c6d4f3-641d-4e96-a015-0a81a2b58225-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:34:10 compute-0 nova_compute[186329]: 2025-12-05 06:34:10.745 186333 DEBUG nova.compute.manager [req-cb3d4d19-8744-4b48-ba5c-5ac86c848845 req-82011b45-5b52-4d6e-bfdb-49f7227523d1 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: a9c6d4f3-641d-4e96-a015-0a81a2b58225] No waiting events found dispatching network-vif-unplugged-271c902e-30a9-4018-a883-e0977a7c70bd pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:34:10 compute-0 nova_compute[186329]: 2025-12-05 06:34:10.745 186333 DEBUG nova.compute.manager [req-cb3d4d19-8744-4b48-ba5c-5ac86c848845 req-82011b45-5b52-4d6e-bfdb-49f7227523d1 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: a9c6d4f3-641d-4e96-a015-0a81a2b58225] Received event network-vif-unplugged-271c902e-30a9-4018-a883-e0977a7c70bd for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 05 06:34:12 compute-0 nova_compute[186329]: 2025-12-05 06:34:12.191 186333 DEBUG nova.network.neutron [-] [instance: a9c6d4f3-641d-4e96-a015-0a81a2b58225] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:34:12 compute-0 nova_compute[186329]: 2025-12-05 06:34:12.694 186333 INFO nova.compute.manager [-] [instance: a9c6d4f3-641d-4e96-a015-0a81a2b58225] Took 2.96 seconds to deallocate network for instance.
Dec 05 06:34:12 compute-0 nova_compute[186329]: 2025-12-05 06:34:12.793 186333 DEBUG nova.compute.manager [req-76846708-f42a-46c9-9f5c-3d693221055e req-626d33d4-050a-458f-836f-b8d976804ffb fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: a9c6d4f3-641d-4e96-a015-0a81a2b58225] Received event network-vif-deleted-271c902e-30a9-4018-a883-e0977a7c70bd external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:34:13 compute-0 nova_compute[186329]: 2025-12-05 06:34:13.205 186333 DEBUG oslo_concurrency.lockutils [None req-052b66a1-58bf-4918-9a38-e72c99592762 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:34:13 compute-0 nova_compute[186329]: 2025-12-05 06:34:13.206 186333 DEBUG oslo_concurrency.lockutils [None req-052b66a1-58bf-4918-9a38-e72c99592762 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:34:13 compute-0 nova_compute[186329]: 2025-12-05 06:34:13.252 186333 DEBUG nova.compute.provider_tree [None req-052b66a1-58bf-4918-9a38-e72c99592762 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:34:13 compute-0 nova_compute[186329]: 2025-12-05 06:34:13.756 186333 DEBUG nova.scheduler.client.report [None req-052b66a1-58bf-4918-9a38-e72c99592762 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:34:14 compute-0 nova_compute[186329]: 2025-12-05 06:34:14.058 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:14 compute-0 nova_compute[186329]: 2025-12-05 06:34:14.222 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:14 compute-0 nova_compute[186329]: 2025-12-05 06:34:14.262 186333 DEBUG oslo_concurrency.lockutils [None req-052b66a1-58bf-4918-9a38-e72c99592762 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.056s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:34:14 compute-0 nova_compute[186329]: 2025-12-05 06:34:14.278 186333 INFO nova.scheduler.client.report [None req-052b66a1-58bf-4918-9a38-e72c99592762 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Deleted allocations for instance a9c6d4f3-641d-4e96-a015-0a81a2b58225
Dec 05 06:34:14 compute-0 podman[215346]: 2025-12-05 06:34:14.456885779 +0000 UTC m=+0.040705903 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 06:34:14 compute-0 podman[215345]: 2025-12-05 06:34:14.479461483 +0000 UTC m=+0.064293749 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.4, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec)
Dec 05 06:34:15 compute-0 nova_compute[186329]: 2025-12-05 06:34:15.293 186333 DEBUG oslo_concurrency.lockutils [None req-052b66a1-58bf-4918-9a38-e72c99592762 19182660d2484754b9a921f3caf09b6b 3c9c02fc0c8641c48e6ccfd619fde68d - - default default] Lock "a9c6d4f3-641d-4e96-a015-0a81a2b58225" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.355s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:34:18 compute-0 nova_compute[186329]: 2025-12-05 06:34:18.096 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:19 compute-0 nova_compute[186329]: 2025-12-05 06:34:19.059 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:19 compute-0 nova_compute[186329]: 2025-12-05 06:34:19.224 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:24 compute-0 nova_compute[186329]: 2025-12-05 06:34:24.060 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:24 compute-0 nova_compute[186329]: 2025-12-05 06:34:24.225 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:24 compute-0 podman[215391]: 2025-12-05 06:34:24.459208805 +0000 UTC m=+0.043183280 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.buildah.version=1.33.7, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, config_id=edpm, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, name=ubi9-minimal, release=1755695350, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 05 06:34:24 compute-0 podman[215390]: 2025-12-05 06:34:24.482477352 +0000 UTC m=+0.068530776 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible)
Dec 05 06:34:24 compute-0 podman[215392]: 2025-12-05 06:34:24.49443856 +0000 UTC m=+0.077081923 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 05 06:34:27 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:27.893 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e7:a3:1b 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-12281049-d2b1-40ef-9535-ec69961f84f0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '96f56ecf15c14a5f9dc32756b35ebccc', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=85adff9c-df19-421b-8711-6ce155f4dd7c, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=cc9bed08-ff40-4708-a7bc-12e84332e3cc) old=Port_Binding(mac=['fa:16:3e:e7:a3:1b'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-12281049-d2b1-40ef-9535-ec69961f84f0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '96f56ecf15c14a5f9dc32756b35ebccc', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:34:27 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:27.894 104041 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port cc9bed08-ff40-4708-a7bc-12e84332e3cc in datapath 12281049-d2b1-40ef-9535-ec69961f84f0 updated
Dec 05 06:34:27 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:27.894 104041 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 12281049-d2b1-40ef-9535-ec69961f84f0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 05 06:34:27 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:27.895 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[3f6fc4a3-3f62-484e-98c6-27a44324fa17]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:34:29 compute-0 nova_compute[186329]: 2025-12-05 06:34:29.061 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:29 compute-0 nova_compute[186329]: 2025-12-05 06:34:29.225 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:29.522 104041 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:34:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:29.523 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:34:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:29.523 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:34:29 compute-0 podman[196599]: time="2025-12-05T06:34:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:34:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:34:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:34:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:34:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2592 "" "Go-http-client/1.1"
Dec 05 06:34:31 compute-0 openstack_network_exporter[198686]: ERROR   06:34:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:34:31 compute-0 openstack_network_exporter[198686]: ERROR   06:34:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:34:31 compute-0 openstack_network_exporter[198686]: ERROR   06:34:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:34:31 compute-0 openstack_network_exporter[198686]: ERROR   06:34:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:34:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:34:31 compute-0 openstack_network_exporter[198686]: ERROR   06:34:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:34:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:34:33 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:33.735 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1c:90:89 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-ca625bf8-064f-49ec-9d2a-6496d0d2c88a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ca625bf8-064f-49ec-9d2a-6496d0d2c88a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7aae6fc46c7e46a0a68d5efcb8c24f87', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e853db72-5d40-4784-a5b6-6693f29945ce, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=990dfaef-236c-44f2-b08f-640ea7a6b49e) old=Port_Binding(mac=['fa:16:3e:1c:90:89'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-ca625bf8-064f-49ec-9d2a-6496d0d2c88a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ca625bf8-064f-49ec-9d2a-6496d0d2c88a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7aae6fc46c7e46a0a68d5efcb8c24f87', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:34:33 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:33.735 104041 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 990dfaef-236c-44f2-b08f-640ea7a6b49e in datapath ca625bf8-064f-49ec-9d2a-6496d0d2c88a updated
Dec 05 06:34:33 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:33.736 104041 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ca625bf8-064f-49ec-9d2a-6496d0d2c88a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 05 06:34:33 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:33.736 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[ebae8b8e-7438-4107-be14-609610290554]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:34:34 compute-0 nova_compute[186329]: 2025-12-05 06:34:34.062 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:34 compute-0 nova_compute[186329]: 2025-12-05 06:34:34.227 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:39 compute-0 nova_compute[186329]: 2025-12-05 06:34:39.063 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:39 compute-0 nova_compute[186329]: 2025-12-05 06:34:39.228 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:44 compute-0 nova_compute[186329]: 2025-12-05 06:34:44.064 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:44 compute-0 nova_compute[186329]: 2025-12-05 06:34:44.229 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:45 compute-0 podman[215445]: 2025-12-05 06:34:45.455620212 +0000 UTC m=+0.037623868 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 05 06:34:45 compute-0 podman[215444]: 2025-12-05 06:34:45.476386366 +0000 UTC m=+0.060713208 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, managed_by=edpm_ansible)
Dec 05 06:34:49 compute-0 nova_compute[186329]: 2025-12-05 06:34:49.066 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:49 compute-0 nova_compute[186329]: 2025-12-05 06:34:49.229 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:50 compute-0 ovn_controller[95223]: 2025-12-05T06:34:50Z|00224|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Dec 05 06:34:52 compute-0 nova_compute[186329]: 2025-12-05 06:34:52.644 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:34:53 compute-0 nova_compute[186329]: 2025-12-05 06:34:53.153 186333 INFO nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] [instance: e148517d-c380-4199-be21-5345ab8bbacd] Destroying instance with name label 'instance-00000019' which is marked as DELETED but still present on host.
Dec 05 06:34:53 compute-0 nova_compute[186329]: 2025-12-05 06:34:53.658 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "e148517d-c380-4199-be21-5345ab8bbacd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:34:53 compute-0 nova_compute[186329]: 2025-12-05 06:34:53.658 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "e148517d-c380-4199-be21-5345ab8bbacd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:34:53 compute-0 nova_compute[186329]: 2025-12-05 06:34:53.659 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "e148517d-c380-4199-be21-5345ab8bbacd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:34:53 compute-0 nova_compute[186329]: 2025-12-05 06:34:53.659 186333 INFO nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] [instance: e148517d-c380-4199-be21-5345ab8bbacd] Terminating instance
Dec 05 06:34:53 compute-0 nova_compute[186329]: 2025-12-05 06:34:53.659 186333 DEBUG nova.objects.instance [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid e148517d-c380-4199-be21-5345ab8bbacd obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:34:53 compute-0 nova_compute[186329]: 2025-12-05 06:34:53.789 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:53.789 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '46:84:d1', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'a6:99:88:db:d9:a2'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:34:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:53.790 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 05 06:34:54 compute-0 nova_compute[186329]: 2025-12-05 06:34:54.067 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:54 compute-0 nova_compute[186329]: 2025-12-05 06:34:54.231 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:54 compute-0 nova_compute[186329]: 2025-12-05 06:34:54.667 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "refresh_cache-e148517d-c380-4199-be21-5345ab8bbacd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:34:54 compute-0 nova_compute[186329]: 2025-12-05 06:34:54.667 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquired lock "refresh_cache-e148517d-c380-4199-be21-5345ab8bbacd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:34:54 compute-0 nova_compute[186329]: 2025-12-05 06:34:54.667 186333 DEBUG nova.network.neutron [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] [instance: e148517d-c380-4199-be21-5345ab8bbacd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 05 06:34:54 compute-0 nova_compute[186329]: 2025-12-05 06:34:54.668 186333 WARNING neutronclient.v2_0.client [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:34:55 compute-0 nova_compute[186329]: 2025-12-05 06:34:55.097 186333 DEBUG nova.network.neutron [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] [instance: e148517d-c380-4199-be21-5345ab8bbacd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 05 06:34:55 compute-0 nova_compute[186329]: 2025-12-05 06:34:55.202 186333 DEBUG nova.network.neutron [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] [instance: e148517d-c380-4199-be21-5345ab8bbacd] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:106
Dec 05 06:34:55 compute-0 nova_compute[186329]: 2025-12-05 06:34:55.203 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Releasing lock "refresh_cache-e148517d-c380-4199-be21-5345ab8bbacd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:34:55 compute-0 nova_compute[186329]: 2025-12-05 06:34:55.203 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] [instance: e148517d-c380-4199-be21-5345ab8bbacd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 05 06:34:55 compute-0 kernel: tap2a0d8e58-c4 (unregistering): left promiscuous mode
Dec 05 06:34:55 compute-0 NetworkManager[55434]: <info>  [1764916495.2276] device (tap2a0d8e58-c4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 06:34:55 compute-0 nova_compute[186329]: 2025-12-05 06:34:55.232 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:55 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000019.scope: Deactivated successfully.
Dec 05 06:34:55 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000019.scope: Consumed 2.460s CPU time.
Dec 05 06:34:55 compute-0 systemd-machined[152967]: Machine qemu-20-instance-00000019 terminated.
Dec 05 06:34:55 compute-0 podman[215490]: 2025-12-05 06:34:55.299397665 +0000 UTC m=+0.055305251 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 05 06:34:55 compute-0 podman[215493]: 2025-12-05 06:34:55.305661562 +0000 UTC m=+0.060000980 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, maintainer=Red Hat, Inc., release=1755695350, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.openshift.expose-services=, name=ubi9-minimal, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc.)
Dec 05 06:34:55 compute-0 podman[215494]: 2025-12-05 06:34:55.310217077 +0000 UTC m=+0.058130433 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible)
Dec 05 06:34:55 compute-0 nova_compute[186329]: 2025-12-05 06:34:55.416 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:55 compute-0 nova_compute[186329]: 2025-12-05 06:34:55.419 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:55 compute-0 nova_compute[186329]: 2025-12-05 06:34:55.445 186333 INFO nova.virt.libvirt.driver [-] [instance: e148517d-c380-4199-be21-5345ab8bbacd] Instance destroyed successfully.
Dec 05 06:34:55 compute-0 nova_compute[186329]: 2025-12-05 06:34:55.445 186333 DEBUG nova.objects.instance [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lazy-loading 'numa_topology' on Instance uuid e148517d-c380-4199-be21-5345ab8bbacd obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:34:55 compute-0 nova_compute[186329]: 2025-12-05 06:34:55.948 186333 DEBUG nova.objects.base [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Object Instance<e148517d-c380-4199-be21-5345ab8bbacd> lazy-loaded attributes: info_cache,numa_topology wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Dec 05 06:34:55 compute-0 nova_compute[186329]: 2025-12-05 06:34:55.949 186333 DEBUG nova.objects.instance [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lazy-loading 'resources' on Instance uuid e148517d-c380-4199-be21-5345ab8bbacd obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:34:56 compute-0 nova_compute[186329]: 2025-12-05 06:34:56.452 186333 DEBUG nova.objects.base [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Object Instance<e148517d-c380-4199-be21-5345ab8bbacd> lazy-loaded attributes: info_cache,numa_topology,resources wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Dec 05 06:34:56 compute-0 nova_compute[186329]: 2025-12-05 06:34:56.453 186333 DEBUG nova.objects.instance [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lazy-loading 'system_metadata' on Instance uuid e148517d-c380-4199-be21-5345ab8bbacd obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:34:56 compute-0 nova_compute[186329]: 2025-12-05 06:34:56.958 186333 DEBUG nova.objects.base [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Object Instance<e148517d-c380-4199-be21-5345ab8bbacd> lazy-loaded attributes: info_cache,numa_topology,resources,system_metadata wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Dec 05 06:34:56 compute-0 nova_compute[186329]: 2025-12-05 06:34:56.958 186333 INFO nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] [instance: e148517d-c380-4199-be21-5345ab8bbacd] Deleting instance files /var/lib/nova/instances/e148517d-c380-4199-be21-5345ab8bbacd_del
Dec 05 06:34:56 compute-0 nova_compute[186329]: 2025-12-05 06:34:56.959 186333 INFO nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] [instance: e148517d-c380-4199-be21-5345ab8bbacd] Deletion of /var/lib/nova/instances/e148517d-c380-4199-be21-5345ab8bbacd_del complete
Dec 05 06:34:57 compute-0 nova_compute[186329]: 2025-12-05 06:34:57.404 186333 DEBUG oslo_concurrency.lockutils [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Acquiring lock "534583f6-ef6e-4921-a665-be74a1ebf1ee" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:34:57 compute-0 nova_compute[186329]: 2025-12-05 06:34:57.404 186333 DEBUG oslo_concurrency.lockutils [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Lock "534583f6-ef6e-4921-a665-be74a1ebf1ee" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:34:57 compute-0 nova_compute[186329]: 2025-12-05 06:34:57.465 186333 INFO nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] [instance: e148517d-c380-4199-be21-5345ab8bbacd] Took 2.26 seconds to destroy the instance on the hypervisor.
Dec 05 06:34:57 compute-0 nova_compute[186329]: 2025-12-05 06:34:57.465 186333 DEBUG oslo.service.backend._eventlet.loopingcall [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 05 06:34:57 compute-0 nova_compute[186329]: 2025-12-05 06:34:57.465 186333 DEBUG nova.compute.manager [-] [instance: e148517d-c380-4199-be21-5345ab8bbacd] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 05 06:34:57 compute-0 nova_compute[186329]: 2025-12-05 06:34:57.465 186333 DEBUG nova.network.neutron [-] [instance: e148517d-c380-4199-be21-5345ab8bbacd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 05 06:34:57 compute-0 nova_compute[186329]: 2025-12-05 06:34:57.466 186333 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:34:57 compute-0 nova_compute[186329]: 2025-12-05 06:34:57.665 186333 DEBUG nova.network.neutron [-] [instance: e148517d-c380-4199-be21-5345ab8bbacd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 05 06:34:57 compute-0 nova_compute[186329]: 2025-12-05 06:34:57.665 186333 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:34:57 compute-0 nova_compute[186329]: 2025-12-05 06:34:57.907 186333 DEBUG nova.compute.manager [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2476
Dec 05 06:34:58 compute-0 nova_compute[186329]: 2025-12-05 06:34:58.169 186333 DEBUG nova.network.neutron [-] [instance: e148517d-c380-4199-be21-5345ab8bbacd] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:106
Dec 05 06:34:58 compute-0 nova_compute[186329]: 2025-12-05 06:34:58.169 186333 INFO nova.compute.manager [-] [instance: e148517d-c380-4199-be21-5345ab8bbacd] Took 0.70 seconds to deallocate network for instance.
Dec 05 06:34:58 compute-0 nova_compute[186329]: 2025-12-05 06:34:58.169 186333 INFO nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] [instance: bbcb12e6-ebf7-49e2-847a-65f1b3a3266c] Destroying instance with name label 'instance-00000012' which is marked as DELETED but still present on host.
Dec 05 06:34:58 compute-0 nova_compute[186329]: 2025-12-05 06:34:58.439 186333 DEBUG oslo_concurrency.lockutils [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:34:58 compute-0 nova_compute[186329]: 2025-12-05 06:34:58.439 186333 DEBUG oslo_concurrency.lockutils [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:34:58 compute-0 nova_compute[186329]: 2025-12-05 06:34:58.444 186333 DEBUG nova.virt.hardware [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Dec 05 06:34:58 compute-0 nova_compute[186329]: 2025-12-05 06:34:58.444 186333 INFO nova.compute.claims [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Claim successful on node compute-0.ctlplane.example.com
Dec 05 06:34:58 compute-0 nova_compute[186329]: 2025-12-05 06:34:58.673 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "bbcb12e6-ebf7-49e2-847a-65f1b3a3266c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:34:58 compute-0 nova_compute[186329]: 2025-12-05 06:34:58.674 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "bbcb12e6-ebf7-49e2-847a-65f1b3a3266c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:34:58 compute-0 nova_compute[186329]: 2025-12-05 06:34:58.674 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "bbcb12e6-ebf7-49e2-847a-65f1b3a3266c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:34:58 compute-0 nova_compute[186329]: 2025-12-05 06:34:58.674 186333 INFO nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] [instance: bbcb12e6-ebf7-49e2-847a-65f1b3a3266c] Terminating instance
Dec 05 06:34:58 compute-0 nova_compute[186329]: 2025-12-05 06:34:58.674 186333 DEBUG nova.objects.instance [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid bbcb12e6-ebf7-49e2-847a-65f1b3a3266c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:34:59 compute-0 nova_compute[186329]: 2025-12-05 06:34:59.068 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:59 compute-0 nova_compute[186329]: 2025-12-05 06:34:59.233 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:59 compute-0 nova_compute[186329]: 2025-12-05 06:34:59.488 186333 DEBUG nova.compute.provider_tree [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:34:59 compute-0 nova_compute[186329]: 2025-12-05 06:34:59.690 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "refresh_cache-bbcb12e6-ebf7-49e2-847a-65f1b3a3266c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:34:59 compute-0 nova_compute[186329]: 2025-12-05 06:34:59.691 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquired lock "refresh_cache-bbcb12e6-ebf7-49e2-847a-65f1b3a3266c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:34:59 compute-0 nova_compute[186329]: 2025-12-05 06:34:59.691 186333 DEBUG nova.network.neutron [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] [instance: bbcb12e6-ebf7-49e2-847a-65f1b3a3266c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 05 06:34:59 compute-0 nova_compute[186329]: 2025-12-05 06:34:59.691 186333 WARNING neutronclient.v2_0.client [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:34:59 compute-0 nova_compute[186329]: 2025-12-05 06:34:59.735 186333 DEBUG nova.network.neutron [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] [instance: bbcb12e6-ebf7-49e2-847a-65f1b3a3266c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 05 06:34:59 compute-0 podman[196599]: time="2025-12-05T06:34:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:34:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:34:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:34:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:34:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2587 "" "Go-http-client/1.1"
Dec 05 06:34:59 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:34:59.790 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=89d40815-76f5-4f1d-9077-84d831b7d6c4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:34:59 compute-0 nova_compute[186329]: 2025-12-05 06:34:59.818 186333 DEBUG nova.network.neutron [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] [instance: bbcb12e6-ebf7-49e2-847a-65f1b3a3266c] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:106
Dec 05 06:34:59 compute-0 nova_compute[186329]: 2025-12-05 06:34:59.819 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Releasing lock "refresh_cache-bbcb12e6-ebf7-49e2-847a-65f1b3a3266c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:34:59 compute-0 nova_compute[186329]: 2025-12-05 06:34:59.819 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] [instance: bbcb12e6-ebf7-49e2-847a-65f1b3a3266c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 05 06:34:59 compute-0 kernel: tap6814c1d4-d0 (unregistering): left promiscuous mode
Dec 05 06:34:59 compute-0 NetworkManager[55434]: <info>  [1764916499.8465] device (tap6814c1d4-d0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 06:34:59 compute-0 nova_compute[186329]: 2025-12-05 06:34:59.853 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:34:59 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000012.scope: Deactivated successfully.
Dec 05 06:34:59 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000012.scope: Consumed 11.560s CPU time.
Dec 05 06:34:59 compute-0 systemd-machined[152967]: Machine qemu-15-instance-00000012 terminated.
Dec 05 06:34:59 compute-0 nova_compute[186329]: 2025-12-05 06:34:59.992 186333 DEBUG nova.scheduler.client.report [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:35:00 compute-0 nova_compute[186329]: 2025-12-05 06:35:00.054 186333 INFO nova.virt.libvirt.driver [-] [instance: bbcb12e6-ebf7-49e2-847a-65f1b3a3266c] Instance destroyed successfully.
Dec 05 06:35:00 compute-0 nova_compute[186329]: 2025-12-05 06:35:00.055 186333 DEBUG nova.objects.instance [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lazy-loading 'numa_topology' on Instance uuid bbcb12e6-ebf7-49e2-847a-65f1b3a3266c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:35:00 compute-0 nova_compute[186329]: 2025-12-05 06:35:00.498 186333 DEBUG oslo_concurrency.lockutils [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.059s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:35:00 compute-0 nova_compute[186329]: 2025-12-05 06:35:00.499 186333 DEBUG nova.compute.manager [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2873
Dec 05 06:35:00 compute-0 nova_compute[186329]: 2025-12-05 06:35:00.558 186333 DEBUG nova.objects.base [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Object Instance<bbcb12e6-ebf7-49e2-847a-65f1b3a3266c> lazy-loaded attributes: info_cache,numa_topology wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Dec 05 06:35:00 compute-0 nova_compute[186329]: 2025-12-05 06:35:00.558 186333 DEBUG nova.objects.instance [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lazy-loading 'resources' on Instance uuid bbcb12e6-ebf7-49e2-847a-65f1b3a3266c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:35:01 compute-0 nova_compute[186329]: 2025-12-05 06:35:01.005 186333 DEBUG nova.compute.manager [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2020
Dec 05 06:35:01 compute-0 nova_compute[186329]: 2025-12-05 06:35:01.006 186333 DEBUG nova.network.neutron [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Dec 05 06:35:01 compute-0 nova_compute[186329]: 2025-12-05 06:35:01.006 186333 WARNING neutronclient.v2_0.client [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:35:01 compute-0 nova_compute[186329]: 2025-12-05 06:35:01.006 186333 WARNING neutronclient.v2_0.client [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:35:01 compute-0 nova_compute[186329]: 2025-12-05 06:35:01.060 186333 DEBUG nova.objects.base [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Object Instance<bbcb12e6-ebf7-49e2-847a-65f1b3a3266c> lazy-loaded attributes: info_cache,numa_topology,resources wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Dec 05 06:35:01 compute-0 nova_compute[186329]: 2025-12-05 06:35:01.060 186333 DEBUG nova.objects.instance [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lazy-loading 'system_metadata' on Instance uuid bbcb12e6-ebf7-49e2-847a-65f1b3a3266c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:35:01 compute-0 nova_compute[186329]: 2025-12-05 06:35:01.412 186333 DEBUG nova.network.neutron [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Successfully created port: 93d39487-d3d7-4773-9646-52ea7199e328 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Dec 05 06:35:01 compute-0 openstack_network_exporter[198686]: ERROR   06:35:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:35:01 compute-0 openstack_network_exporter[198686]: ERROR   06:35:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:35:01 compute-0 openstack_network_exporter[198686]: ERROR   06:35:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:35:01 compute-0 openstack_network_exporter[198686]: ERROR   06:35:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:35:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:35:01 compute-0 openstack_network_exporter[198686]: ERROR   06:35:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:35:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:35:01 compute-0 nova_compute[186329]: 2025-12-05 06:35:01.511 186333 INFO nova.virt.libvirt.driver [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 05 06:35:01 compute-0 nova_compute[186329]: 2025-12-05 06:35:01.563 186333 DEBUG nova.objects.base [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Object Instance<bbcb12e6-ebf7-49e2-847a-65f1b3a3266c> lazy-loaded attributes: info_cache,numa_topology,resources,system_metadata wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Dec 05 06:35:01 compute-0 nova_compute[186329]: 2025-12-05 06:35:01.564 186333 INFO nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] [instance: bbcb12e6-ebf7-49e2-847a-65f1b3a3266c] Deleting instance files /var/lib/nova/instances/bbcb12e6-ebf7-49e2-847a-65f1b3a3266c_del
Dec 05 06:35:01 compute-0 nova_compute[186329]: 2025-12-05 06:35:01.564 186333 INFO nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] [instance: bbcb12e6-ebf7-49e2-847a-65f1b3a3266c] Deletion of /var/lib/nova/instances/bbcb12e6-ebf7-49e2-847a-65f1b3a3266c_del complete
Dec 05 06:35:01 compute-0 nova_compute[186329]: 2025-12-05 06:35:01.901 186333 DEBUG nova.network.neutron [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Successfully updated port: 93d39487-d3d7-4773-9646-52ea7199e328 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Dec 05 06:35:01 compute-0 nova_compute[186329]: 2025-12-05 06:35:01.952 186333 DEBUG nova.compute.manager [req-db840d94-13f3-448e-ba45-d4f6e67155c4 req-b313d7c5-5063-4539-948f-f9d20e992805 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Received event network-changed-93d39487-d3d7-4773-9646-52ea7199e328 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:35:01 compute-0 nova_compute[186329]: 2025-12-05 06:35:01.952 186333 DEBUG nova.compute.manager [req-db840d94-13f3-448e-ba45-d4f6e67155c4 req-b313d7c5-5063-4539-948f-f9d20e992805 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Refreshing instance network info cache due to event network-changed-93d39487-d3d7-4773-9646-52ea7199e328. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11821
Dec 05 06:35:01 compute-0 nova_compute[186329]: 2025-12-05 06:35:01.952 186333 DEBUG oslo_concurrency.lockutils [req-db840d94-13f3-448e-ba45-d4f6e67155c4 req-b313d7c5-5063-4539-948f-f9d20e992805 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "refresh_cache-534583f6-ef6e-4921-a665-be74a1ebf1ee" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:35:01 compute-0 nova_compute[186329]: 2025-12-05 06:35:01.952 186333 DEBUG oslo_concurrency.lockutils [req-db840d94-13f3-448e-ba45-d4f6e67155c4 req-b313d7c5-5063-4539-948f-f9d20e992805 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquired lock "refresh_cache-534583f6-ef6e-4921-a665-be74a1ebf1ee" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:35:01 compute-0 nova_compute[186329]: 2025-12-05 06:35:01.953 186333 DEBUG nova.network.neutron [req-db840d94-13f3-448e-ba45-d4f6e67155c4 req-b313d7c5-5063-4539-948f-f9d20e992805 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Refreshing network info cache for port 93d39487-d3d7-4773-9646-52ea7199e328 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Dec 05 06:35:02 compute-0 nova_compute[186329]: 2025-12-05 06:35:02.015 186333 DEBUG nova.compute.manager [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2908
Dec 05 06:35:02 compute-0 nova_compute[186329]: 2025-12-05 06:35:02.069 186333 INFO nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] [instance: bbcb12e6-ebf7-49e2-847a-65f1b3a3266c] Took 2.25 seconds to destroy the instance on the hypervisor.
Dec 05 06:35:02 compute-0 nova_compute[186329]: 2025-12-05 06:35:02.069 186333 DEBUG oslo.service.backend._eventlet.loopingcall [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 05 06:35:02 compute-0 nova_compute[186329]: 2025-12-05 06:35:02.069 186333 DEBUG nova.compute.manager [-] [instance: bbcb12e6-ebf7-49e2-847a-65f1b3a3266c] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 05 06:35:02 compute-0 nova_compute[186329]: 2025-12-05 06:35:02.070 186333 DEBUG nova.network.neutron [-] [instance: bbcb12e6-ebf7-49e2-847a-65f1b3a3266c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 05 06:35:02 compute-0 nova_compute[186329]: 2025-12-05 06:35:02.070 186333 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:35:02 compute-0 nova_compute[186329]: 2025-12-05 06:35:02.117 186333 DEBUG nova.network.neutron [-] [instance: bbcb12e6-ebf7-49e2-847a-65f1b3a3266c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 05 06:35:02 compute-0 nova_compute[186329]: 2025-12-05 06:35:02.118 186333 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:35:02 compute-0 nova_compute[186329]: 2025-12-05 06:35:02.405 186333 DEBUG oslo_concurrency.lockutils [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Acquiring lock "refresh_cache-534583f6-ef6e-4921-a665-be74a1ebf1ee" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:35:02 compute-0 nova_compute[186329]: 2025-12-05 06:35:02.455 186333 WARNING neutronclient.v2_0.client [req-db840d94-13f3-448e-ba45-d4f6e67155c4 req-b313d7c5-5063-4539-948f-f9d20e992805 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:35:02 compute-0 nova_compute[186329]: 2025-12-05 06:35:02.514 186333 DEBUG nova.network.neutron [req-db840d94-13f3-448e-ba45-d4f6e67155c4 req-b313d7c5-5063-4539-948f-f9d20e992805 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 05 06:35:02 compute-0 nova_compute[186329]: 2025-12-05 06:35:02.619 186333 DEBUG nova.network.neutron [req-db840d94-13f3-448e-ba45-d4f6e67155c4 req-b313d7c5-5063-4539-948f-f9d20e992805 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:35:02 compute-0 nova_compute[186329]: 2025-12-05 06:35:02.620 186333 DEBUG nova.network.neutron [-] [instance: bbcb12e6-ebf7-49e2-847a-65f1b3a3266c] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:106
Dec 05 06:35:02 compute-0 nova_compute[186329]: 2025-12-05 06:35:02.620 186333 INFO nova.compute.manager [-] [instance: bbcb12e6-ebf7-49e2-847a-65f1b3a3266c] Took 0.55 seconds to deallocate network for instance.
Dec 05 06:35:02 compute-0 nova_compute[186329]: 2025-12-05 06:35:02.621 186333 INFO nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] [instance: 89d73880-ffbb-49c5-9e2d-a49a64c44523] Destroying instance with name label 'instance-0000000e' which is marked as DELETED but still present on host.
Dec 05 06:35:03 compute-0 nova_compute[186329]: 2025-12-05 06:35:03.024 186333 DEBUG nova.compute.manager [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2682
Dec 05 06:35:03 compute-0 nova_compute[186329]: 2025-12-05 06:35:03.025 186333 DEBUG nova.virt.libvirt.driver [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Dec 05 06:35:03 compute-0 nova_compute[186329]: 2025-12-05 06:35:03.026 186333 INFO nova.virt.libvirt.driver [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Creating image(s)
Dec 05 06:35:03 compute-0 nova_compute[186329]: 2025-12-05 06:35:03.026 186333 DEBUG oslo_concurrency.lockutils [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Acquiring lock "/var/lib/nova/instances/534583f6-ef6e-4921-a665-be74a1ebf1ee/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:35:03 compute-0 nova_compute[186329]: 2025-12-05 06:35:03.026 186333 DEBUG oslo_concurrency.lockutils [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Lock "/var/lib/nova/instances/534583f6-ef6e-4921-a665-be74a1ebf1ee/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:35:03 compute-0 nova_compute[186329]: 2025-12-05 06:35:03.027 186333 DEBUG oslo_concurrency.lockutils [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Lock "/var/lib/nova/instances/534583f6-ef6e-4921-a665-be74a1ebf1ee/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:35:03 compute-0 nova_compute[186329]: 2025-12-05 06:35:03.027 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:35:03 compute-0 nova_compute[186329]: 2025-12-05 06:35:03.030 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:35:03 compute-0 nova_compute[186329]: 2025-12-05 06:35:03.031 186333 DEBUG oslo_concurrency.processutils [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:35:03 compute-0 nova_compute[186329]: 2025-12-05 06:35:03.073 186333 DEBUG oslo_concurrency.processutils [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:35:03 compute-0 nova_compute[186329]: 2025-12-05 06:35:03.073 186333 DEBUG oslo_concurrency.lockutils [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Acquiring lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:35:03 compute-0 nova_compute[186329]: 2025-12-05 06:35:03.074 186333 DEBUG oslo_concurrency.lockutils [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:35:03 compute-0 nova_compute[186329]: 2025-12-05 06:35:03.074 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:35:03 compute-0 nova_compute[186329]: 2025-12-05 06:35:03.077 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:35:03 compute-0 nova_compute[186329]: 2025-12-05 06:35:03.078 186333 DEBUG oslo_concurrency.processutils [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:35:03 compute-0 nova_compute[186329]: 2025-12-05 06:35:03.117 186333 DEBUG oslo_concurrency.processutils [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:35:03 compute-0 nova_compute[186329]: 2025-12-05 06:35:03.118 186333 DEBUG oslo_concurrency.processutils [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b,backing_fmt=raw /var/lib/nova/instances/534583f6-ef6e-4921-a665-be74a1ebf1ee/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:35:03 compute-0 nova_compute[186329]: 2025-12-05 06:35:03.125 186333 DEBUG oslo_concurrency.lockutils [req-db840d94-13f3-448e-ba45-d4f6e67155c4 req-b313d7c5-5063-4539-948f-f9d20e992805 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Releasing lock "refresh_cache-534583f6-ef6e-4921-a665-be74a1ebf1ee" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:35:03 compute-0 nova_compute[186329]: 2025-12-05 06:35:03.126 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "89d73880-ffbb-49c5-9e2d-a49a64c44523-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:35:03 compute-0 nova_compute[186329]: 2025-12-05 06:35:03.127 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "89d73880-ffbb-49c5-9e2d-a49a64c44523-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:35:03 compute-0 nova_compute[186329]: 2025-12-05 06:35:03.127 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "89d73880-ffbb-49c5-9e2d-a49a64c44523-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:35:03 compute-0 nova_compute[186329]: 2025-12-05 06:35:03.127 186333 INFO nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] [instance: 89d73880-ffbb-49c5-9e2d-a49a64c44523] Terminating instance
Dec 05 06:35:03 compute-0 nova_compute[186329]: 2025-12-05 06:35:03.127 186333 DEBUG nova.objects.instance [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 89d73880-ffbb-49c5-9e2d-a49a64c44523 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:35:03 compute-0 nova_compute[186329]: 2025-12-05 06:35:03.129 186333 DEBUG oslo_concurrency.lockutils [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Acquired lock "refresh_cache-534583f6-ef6e-4921-a665-be74a1ebf1ee" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:35:03 compute-0 nova_compute[186329]: 2025-12-05 06:35:03.129 186333 DEBUG nova.network.neutron [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 05 06:35:03 compute-0 nova_compute[186329]: 2025-12-05 06:35:03.136 186333 DEBUG oslo_concurrency.processutils [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b,backing_fmt=raw /var/lib/nova/instances/534583f6-ef6e-4921-a665-be74a1ebf1ee/disk 1073741824" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:35:03 compute-0 nova_compute[186329]: 2025-12-05 06:35:03.136 186333 DEBUG oslo_concurrency.lockutils [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.063s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:35:03 compute-0 nova_compute[186329]: 2025-12-05 06:35:03.137 186333 DEBUG oslo_concurrency.processutils [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:35:03 compute-0 nova_compute[186329]: 2025-12-05 06:35:03.176 186333 DEBUG oslo_concurrency.processutils [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:35:03 compute-0 nova_compute[186329]: 2025-12-05 06:35:03.177 186333 DEBUG nova.virt.disk.api [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Checking if we can resize image /var/lib/nova/instances/534583f6-ef6e-4921-a665-be74a1ebf1ee/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 05 06:35:03 compute-0 nova_compute[186329]: 2025-12-05 06:35:03.177 186333 DEBUG oslo_concurrency.processutils [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/534583f6-ef6e-4921-a665-be74a1ebf1ee/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:35:03 compute-0 nova_compute[186329]: 2025-12-05 06:35:03.219 186333 DEBUG oslo_concurrency.processutils [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/534583f6-ef6e-4921-a665-be74a1ebf1ee/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:35:03 compute-0 nova_compute[186329]: 2025-12-05 06:35:03.219 186333 DEBUG nova.virt.disk.api [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Cannot resize image /var/lib/nova/instances/534583f6-ef6e-4921-a665-be74a1ebf1ee/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 05 06:35:03 compute-0 nova_compute[186329]: 2025-12-05 06:35:03.220 186333 DEBUG nova.virt.libvirt.driver [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Dec 05 06:35:03 compute-0 nova_compute[186329]: 2025-12-05 06:35:03.220 186333 DEBUG nova.virt.libvirt.driver [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Ensure instance console log exists: /var/lib/nova/instances/534583f6-ef6e-4921-a665-be74a1ebf1ee/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Dec 05 06:35:03 compute-0 nova_compute[186329]: 2025-12-05 06:35:03.220 186333 DEBUG oslo_concurrency.lockutils [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:35:03 compute-0 nova_compute[186329]: 2025-12-05 06:35:03.221 186333 DEBUG oslo_concurrency.lockutils [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:35:03 compute-0 nova_compute[186329]: 2025-12-05 06:35:03.221 186333 DEBUG oslo_concurrency.lockutils [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:35:03 compute-0 nova_compute[186329]: 2025-12-05 06:35:03.695 186333 DEBUG nova.network.neutron [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 05 06:35:03 compute-0 nova_compute[186329]: 2025-12-05 06:35:03.850 186333 WARNING neutronclient.v2_0.client [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:35:03 compute-0 nova_compute[186329]: 2025-12-05 06:35:03.980 186333 DEBUG nova.network.neutron [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Updating instance_info_cache with network_info: [{"id": "93d39487-d3d7-4773-9646-52ea7199e328", "address": "fa:16:3e:44:92:0f", "network": {"id": "12281049-d2b1-40ef-9535-ec69961f84f0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1145425560-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f56ecf15c14a5f9dc32756b35ebccc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93d39487-d3", "ovs_interfaceid": "93d39487-d3d7-4773-9646-52ea7199e328", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:35:04 compute-0 nova_compute[186329]: 2025-12-05 06:35:04.069 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:04 compute-0 nova_compute[186329]: 2025-12-05 06:35:04.141 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "refresh_cache-89d73880-ffbb-49c5-9e2d-a49a64c44523" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:35:04 compute-0 nova_compute[186329]: 2025-12-05 06:35:04.141 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquired lock "refresh_cache-89d73880-ffbb-49c5-9e2d-a49a64c44523" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:35:04 compute-0 nova_compute[186329]: 2025-12-05 06:35:04.141 186333 DEBUG nova.network.neutron [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] [instance: 89d73880-ffbb-49c5-9e2d-a49a64c44523] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 05 06:35:04 compute-0 nova_compute[186329]: 2025-12-05 06:35:04.141 186333 WARNING neutronclient.v2_0.client [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:35:04 compute-0 nova_compute[186329]: 2025-12-05 06:35:04.235 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:04 compute-0 nova_compute[186329]: 2025-12-05 06:35:04.484 186333 DEBUG oslo_concurrency.lockutils [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Releasing lock "refresh_cache-534583f6-ef6e-4921-a665-be74a1ebf1ee" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:35:04 compute-0 nova_compute[186329]: 2025-12-05 06:35:04.485 186333 DEBUG nova.compute.manager [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Instance network_info: |[{"id": "93d39487-d3d7-4773-9646-52ea7199e328", "address": "fa:16:3e:44:92:0f", "network": {"id": "12281049-d2b1-40ef-9535-ec69961f84f0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1145425560-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f56ecf15c14a5f9dc32756b35ebccc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93d39487-d3", "ovs_interfaceid": "93d39487-d3d7-4773-9646-52ea7199e328", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2035
Dec 05 06:35:04 compute-0 nova_compute[186329]: 2025-12-05 06:35:04.486 186333 DEBUG nova.virt.libvirt.driver [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Start _get_guest_xml network_info=[{"id": "93d39487-d3d7-4773-9646-52ea7199e328", "address": "fa:16:3e:44:92:0f", "network": {"id": "12281049-d2b1-40ef-9535-ec69961f84f0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1145425560-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f56ecf15c14a5f9dc32756b35ebccc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93d39487-d3", "ovs_interfaceid": "93d39487-d3d7-4773-9646-52ea7199e328", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T06:07:47Z,direct_url=<?>,disk_format='qcow2',id=6903ca06-7f44-4ad2-ab8b-0d16feef7d51,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fcef582be2274b9ba43451b49b4066ec',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T06:07:49Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_format': None, 'encryption_options': None, 'size': 0, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_secret_uuid': None, 'boot_index': 0, 'encrypted': False, 'device_name': '/dev/vda', 'guest_format': None, 'image_id': '6903ca06-7f44-4ad2-ab8b-0d16feef7d51'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Dec 05 06:35:04 compute-0 nova_compute[186329]: 2025-12-05 06:35:04.489 186333 WARNING nova.virt.libvirt.driver [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:35:04 compute-0 nova_compute[186329]: 2025-12-05 06:35:04.490 186333 DEBUG nova.virt.driver [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='6903ca06-7f44-4ad2-ab8b-0d16feef7d51', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteZoneMigrationStrategy-server-95184258', uuid='534583f6-ef6e-4921-a665-be74a1ebf1ee'), owner=OwnerMeta(userid='0bd6252356344de684e4bce5dcc3c2d3', username='tempest-TestExecuteZoneMigrationStrategy-617599826-project-admin', projectid='7aae6fc46c7e46a0a68d5efcb8c24f87', projectname='tempest-TestExecuteZoneMigrationStrategy-617599826'), image=ImageMeta(id='6903ca06-7f44-4ad2-ab8b-0d16feef7d51', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='cb13e320-971c-46c2-a935-d695f3631bf8', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "93d39487-d3d7-4773-9646-52ea7199e328", "address": "fa:16:3e:44:92:0f", "network": {"id": "12281049-d2b1-40ef-9535-ec69961f84f0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1145425560-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f56ecf15c14a5f9dc32756b35ebccc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93d39487-d3", "ovs_interfaceid": "93d39487-d3d7-4773-9646-52ea7199e328", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20251105112212.710ffbb.el10', creation_time=1764916504.4899938) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Dec 05 06:35:04 compute-0 nova_compute[186329]: 2025-12-05 06:35:04.493 186333 DEBUG nova.virt.libvirt.host [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Dec 05 06:35:04 compute-0 nova_compute[186329]: 2025-12-05 06:35:04.494 186333 DEBUG nova.virt.libvirt.host [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Dec 05 06:35:04 compute-0 nova_compute[186329]: 2025-12-05 06:35:04.497 186333 DEBUG nova.virt.libvirt.host [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Dec 05 06:35:04 compute-0 nova_compute[186329]: 2025-12-05 06:35:04.497 186333 DEBUG nova.virt.libvirt.host [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Dec 05 06:35:04 compute-0 nova_compute[186329]: 2025-12-05 06:35:04.498 186333 DEBUG nova.virt.libvirt.driver [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Dec 05 06:35:04 compute-0 nova_compute[186329]: 2025-12-05 06:35:04.498 186333 DEBUG nova.virt.hardware [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-05T06:07:46Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='cb13e320-971c-46c2-a935-d695f3631bf8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-05T06:07:47Z,direct_url=<?>,disk_format='qcow2',id=6903ca06-7f44-4ad2-ab8b-0d16feef7d51,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='fcef582be2274b9ba43451b49b4066ec',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-05T06:07:49Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Dec 05 06:35:04 compute-0 nova_compute[186329]: 2025-12-05 06:35:04.499 186333 DEBUG nova.virt.hardware [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Dec 05 06:35:04 compute-0 nova_compute[186329]: 2025-12-05 06:35:04.499 186333 DEBUG nova.virt.hardware [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Dec 05 06:35:04 compute-0 nova_compute[186329]: 2025-12-05 06:35:04.499 186333 DEBUG nova.virt.hardware [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Dec 05 06:35:04 compute-0 nova_compute[186329]: 2025-12-05 06:35:04.499 186333 DEBUG nova.virt.hardware [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Dec 05 06:35:04 compute-0 nova_compute[186329]: 2025-12-05 06:35:04.499 186333 DEBUG nova.virt.hardware [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Dec 05 06:35:04 compute-0 nova_compute[186329]: 2025-12-05 06:35:04.499 186333 DEBUG nova.virt.hardware [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Dec 05 06:35:04 compute-0 nova_compute[186329]: 2025-12-05 06:35:04.499 186333 DEBUG nova.virt.hardware [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Dec 05 06:35:04 compute-0 nova_compute[186329]: 2025-12-05 06:35:04.500 186333 DEBUG nova.virt.hardware [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Dec 05 06:35:04 compute-0 nova_compute[186329]: 2025-12-05 06:35:04.500 186333 DEBUG nova.virt.hardware [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Dec 05 06:35:04 compute-0 nova_compute[186329]: 2025-12-05 06:35:04.500 186333 DEBUG nova.virt.hardware [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Dec 05 06:35:04 compute-0 nova_compute[186329]: 2025-12-05 06:35:04.503 186333 DEBUG nova.virt.libvirt.vif [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-05T06:34:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-95184258',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-95184258',id=27,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7aae6fc46c7e46a0a68d5efcb8c24f87',ramdisk_id='',reservation_id='r-zv5oq78d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,member,admin,reader',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-617599826',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-617599826-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T06:35:02Z,user_data=None,user_id='0bd6252356344de684e4bce5dcc3c2d3',uuid=534583f6-ef6e-4921-a665-be74a1ebf1ee,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "93d39487-d3d7-4773-9646-52ea7199e328", "address": "fa:16:3e:44:92:0f", "network": {"id": "12281049-d2b1-40ef-9535-ec69961f84f0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1145425560-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f56ecf15c14a5f9dc32756b35ebccc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93d39487-d3", "ovs_interfaceid": "93d39487-d3d7-4773-9646-52ea7199e328", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Dec 05 06:35:04 compute-0 nova_compute[186329]: 2025-12-05 06:35:04.503 186333 DEBUG nova.network.os_vif_util [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Converting VIF {"id": "93d39487-d3d7-4773-9646-52ea7199e328", "address": "fa:16:3e:44:92:0f", "network": {"id": "12281049-d2b1-40ef-9535-ec69961f84f0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1145425560-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f56ecf15c14a5f9dc32756b35ebccc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93d39487-d3", "ovs_interfaceid": "93d39487-d3d7-4773-9646-52ea7199e328", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:35:04 compute-0 nova_compute[186329]: 2025-12-05 06:35:04.503 186333 DEBUG nova.network.os_vif_util [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:92:0f,bridge_name='br-int',has_traffic_filtering=True,id=93d39487-d3d7-4773-9646-52ea7199e328,network=Network(12281049-d2b1-40ef-9535-ec69961f84f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93d39487-d3') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:35:04 compute-0 nova_compute[186329]: 2025-12-05 06:35:04.504 186333 DEBUG nova.objects.instance [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Lazy-loading 'pci_devices' on Instance uuid 534583f6-ef6e-4921-a665-be74a1ebf1ee obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:35:04 compute-0 nova_compute[186329]: 2025-12-05 06:35:04.671 186333 DEBUG nova.network.neutron [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] [instance: 89d73880-ffbb-49c5-9e2d-a49a64c44523] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 05 06:35:04 compute-0 nova_compute[186329]: 2025-12-05 06:35:04.767 186333 DEBUG nova.network.neutron [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] [instance: 89d73880-ffbb-49c5-9e2d-a49a64c44523] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:106
Dec 05 06:35:04 compute-0 nova_compute[186329]: 2025-12-05 06:35:04.768 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Releasing lock "refresh_cache-89d73880-ffbb-49c5-9e2d-a49a64c44523" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:35:04 compute-0 nova_compute[186329]: 2025-12-05 06:35:04.768 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] [instance: 89d73880-ffbb-49c5-9e2d-a49a64c44523] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 05 06:35:04 compute-0 kernel: tap66b62d9b-f2 (unregistering): left promiscuous mode
Dec 05 06:35:04 compute-0 NetworkManager[55434]: <info>  [1764916504.7942] device (tap66b62d9b-f2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 06:35:04 compute-0 nova_compute[186329]: 2025-12-05 06:35:04.796 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:04 compute-0 nova_compute[186329]: 2025-12-05 06:35:04.803 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:04 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000e.scope: Deactivated successfully.
Dec 05 06:35:04 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000e.scope: Consumed 16.425s CPU time.
Dec 05 06:35:04 compute-0 systemd-machined[152967]: Machine qemu-12-instance-0000000e terminated.
Dec 05 06:35:05 compute-0 nova_compute[186329]: 2025-12-05 06:35:04.999 186333 INFO nova.virt.libvirt.driver [-] [instance: 89d73880-ffbb-49c5-9e2d-a49a64c44523] Instance destroyed successfully.
Dec 05 06:35:05 compute-0 nova_compute[186329]: 2025-12-05 06:35:05.000 186333 DEBUG nova.objects.instance [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lazy-loading 'numa_topology' on Instance uuid 89d73880-ffbb-49c5-9e2d-a49a64c44523 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:35:05 compute-0 nova_compute[186329]: 2025-12-05 06:35:05.009 186333 DEBUG nova.virt.libvirt.driver [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] End _get_guest_xml xml=<domain type="kvm">
Dec 05 06:35:05 compute-0 nova_compute[186329]:   <uuid>534583f6-ef6e-4921-a665-be74a1ebf1ee</uuid>
Dec 05 06:35:05 compute-0 nova_compute[186329]:   <name>instance-0000001b</name>
Dec 05 06:35:05 compute-0 nova_compute[186329]:   <memory>131072</memory>
Dec 05 06:35:05 compute-0 nova_compute[186329]:   <vcpu>1</vcpu>
Dec 05 06:35:05 compute-0 nova_compute[186329]:   <metadata>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 05 06:35:05 compute-0 nova_compute[186329]:       <nova:package version="32.1.0-0.20251105112212.710ffbb.el10"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:       <nova:name>tempest-TestExecuteZoneMigrationStrategy-server-95184258</nova:name>
Dec 05 06:35:05 compute-0 nova_compute[186329]:       <nova:creationTime>2025-12-05 06:35:04</nova:creationTime>
Dec 05 06:35:05 compute-0 nova_compute[186329]:       <nova:flavor name="m1.nano" id="cb13e320-971c-46c2-a935-d695f3631bf8">
Dec 05 06:35:05 compute-0 nova_compute[186329]:         <nova:memory>128</nova:memory>
Dec 05 06:35:05 compute-0 nova_compute[186329]:         <nova:disk>1</nova:disk>
Dec 05 06:35:05 compute-0 nova_compute[186329]:         <nova:swap>0</nova:swap>
Dec 05 06:35:05 compute-0 nova_compute[186329]:         <nova:ephemeral>0</nova:ephemeral>
Dec 05 06:35:05 compute-0 nova_compute[186329]:         <nova:vcpus>1</nova:vcpus>
Dec 05 06:35:05 compute-0 nova_compute[186329]:         <nova:extraSpecs>
Dec 05 06:35:05 compute-0 nova_compute[186329]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Dec 05 06:35:05 compute-0 nova_compute[186329]:         </nova:extraSpecs>
Dec 05 06:35:05 compute-0 nova_compute[186329]:       </nova:flavor>
Dec 05 06:35:05 compute-0 nova_compute[186329]:       <nova:image uuid="6903ca06-7f44-4ad2-ab8b-0d16feef7d51">
Dec 05 06:35:05 compute-0 nova_compute[186329]:         <nova:containerFormat>bare</nova:containerFormat>
Dec 05 06:35:05 compute-0 nova_compute[186329]:         <nova:diskFormat>qcow2</nova:diskFormat>
Dec 05 06:35:05 compute-0 nova_compute[186329]:         <nova:minDisk>1</nova:minDisk>
Dec 05 06:35:05 compute-0 nova_compute[186329]:         <nova:minRam>0</nova:minRam>
Dec 05 06:35:05 compute-0 nova_compute[186329]:         <nova:properties>
Dec 05 06:35:05 compute-0 nova_compute[186329]:           <nova:property name="hw_rng_model">virtio</nova:property>
Dec 05 06:35:05 compute-0 nova_compute[186329]:         </nova:properties>
Dec 05 06:35:05 compute-0 nova_compute[186329]:       </nova:image>
Dec 05 06:35:05 compute-0 nova_compute[186329]:       <nova:owner>
Dec 05 06:35:05 compute-0 nova_compute[186329]:         <nova:user uuid="0bd6252356344de684e4bce5dcc3c2d3">tempest-TestExecuteZoneMigrationStrategy-617599826-project-admin</nova:user>
Dec 05 06:35:05 compute-0 nova_compute[186329]:         <nova:project uuid="7aae6fc46c7e46a0a68d5efcb8c24f87">tempest-TestExecuteZoneMigrationStrategy-617599826</nova:project>
Dec 05 06:35:05 compute-0 nova_compute[186329]:       </nova:owner>
Dec 05 06:35:05 compute-0 nova_compute[186329]:       <nova:root type="image" uuid="6903ca06-7f44-4ad2-ab8b-0d16feef7d51"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:       <nova:ports>
Dec 05 06:35:05 compute-0 nova_compute[186329]:         <nova:port uuid="93d39487-d3d7-4773-9646-52ea7199e328">
Dec 05 06:35:05 compute-0 nova_compute[186329]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:         </nova:port>
Dec 05 06:35:05 compute-0 nova_compute[186329]:       </nova:ports>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     </nova:instance>
Dec 05 06:35:05 compute-0 nova_compute[186329]:   </metadata>
Dec 05 06:35:05 compute-0 nova_compute[186329]:   <sysinfo type="smbios">
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <system>
Dec 05 06:35:05 compute-0 nova_compute[186329]:       <entry name="manufacturer">RDO</entry>
Dec 05 06:35:05 compute-0 nova_compute[186329]:       <entry name="product">OpenStack Compute</entry>
Dec 05 06:35:05 compute-0 nova_compute[186329]:       <entry name="version">32.1.0-0.20251105112212.710ffbb.el10</entry>
Dec 05 06:35:05 compute-0 nova_compute[186329]:       <entry name="serial">534583f6-ef6e-4921-a665-be74a1ebf1ee</entry>
Dec 05 06:35:05 compute-0 nova_compute[186329]:       <entry name="uuid">534583f6-ef6e-4921-a665-be74a1ebf1ee</entry>
Dec 05 06:35:05 compute-0 nova_compute[186329]:       <entry name="family">Virtual Machine</entry>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     </system>
Dec 05 06:35:05 compute-0 nova_compute[186329]:   </sysinfo>
Dec 05 06:35:05 compute-0 nova_compute[186329]:   <os>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <boot dev="hd"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <smbios mode="sysinfo"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:   </os>
Dec 05 06:35:05 compute-0 nova_compute[186329]:   <features>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <acpi/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <apic/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <vmcoreinfo/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:   </features>
Dec 05 06:35:05 compute-0 nova_compute[186329]:   <clock offset="utc">
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <timer name="pit" tickpolicy="delay"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <timer name="hpet" present="no"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:   </clock>
Dec 05 06:35:05 compute-0 nova_compute[186329]:   <cpu mode="custom" match="exact">
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <model>Nehalem</model>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <topology sockets="1" cores="1" threads="1"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:   </cpu>
Dec 05 06:35:05 compute-0 nova_compute[186329]:   <devices>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <disk type="file" device="disk">
Dec 05 06:35:05 compute-0 nova_compute[186329]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:       <source file="/var/lib/nova/instances/534583f6-ef6e-4921-a665-be74a1ebf1ee/disk"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:       <target dev="vda" bus="virtio"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     </disk>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <disk type="file" device="cdrom">
Dec 05 06:35:05 compute-0 nova_compute[186329]:       <driver name="qemu" type="raw" cache="none"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:       <source file="/var/lib/nova/instances/534583f6-ef6e-4921-a665-be74a1ebf1ee/disk.config"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:       <target dev="sda" bus="sata"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     </disk>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <interface type="ethernet">
Dec 05 06:35:05 compute-0 nova_compute[186329]:       <mac address="fa:16:3e:44:92:0f"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:       <model type="virtio"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:       <driver name="vhost" rx_queue_size="512"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:       <mtu size="1442"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:       <target dev="tap93d39487-d3"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     </interface>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <serial type="pty">
Dec 05 06:35:05 compute-0 nova_compute[186329]:       <log file="/var/lib/nova/instances/534583f6-ef6e-4921-a665-be74a1ebf1ee/console.log" append="off"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     </serial>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <video>
Dec 05 06:35:05 compute-0 nova_compute[186329]:       <model type="virtio"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     </video>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <input type="tablet" bus="usb"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <rng model="virtio">
Dec 05 06:35:05 compute-0 nova_compute[186329]:       <backend model="random">/dev/urandom</backend>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     </rng>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <controller type="pci" model="pcie-root-port"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <controller type="usb" index="0"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Dec 05 06:35:05 compute-0 nova_compute[186329]:       <stats period="10"/>
Dec 05 06:35:05 compute-0 nova_compute[186329]:     </memballoon>
Dec 05 06:35:05 compute-0 nova_compute[186329]:   </devices>
Dec 05 06:35:05 compute-0 nova_compute[186329]: </domain>
Dec 05 06:35:05 compute-0 nova_compute[186329]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Dec 05 06:35:05 compute-0 nova_compute[186329]: 2025-12-05 06:35:05.009 186333 DEBUG nova.compute.manager [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Preparing to wait for external event network-vif-plugged-93d39487-d3d7-4773-9646-52ea7199e328 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:308
Dec 05 06:35:05 compute-0 nova_compute[186329]: 2025-12-05 06:35:05.010 186333 DEBUG oslo_concurrency.lockutils [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Acquiring lock "534583f6-ef6e-4921-a665-be74a1ebf1ee-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:35:05 compute-0 nova_compute[186329]: 2025-12-05 06:35:05.010 186333 DEBUG oslo_concurrency.lockutils [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Lock "534583f6-ef6e-4921-a665-be74a1ebf1ee-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:35:05 compute-0 nova_compute[186329]: 2025-12-05 06:35:05.010 186333 DEBUG oslo_concurrency.lockutils [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Lock "534583f6-ef6e-4921-a665-be74a1ebf1ee-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:35:05 compute-0 nova_compute[186329]: 2025-12-05 06:35:05.010 186333 DEBUG nova.virt.libvirt.vif [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-12-05T06:34:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-95184258',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-95184258',id=27,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7aae6fc46c7e46a0a68d5efcb8c24f87',ramdisk_id='',reservation_id='r-zv5oq78d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,member,admin,reader',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-617599826',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-617599826-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T06:35:02Z,user_data=None,user_id='0bd6252356344de684e4bce5dcc3c2d3',uuid=534583f6-ef6e-4921-a665-be74a1ebf1ee,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "93d39487-d3d7-4773-9646-52ea7199e328", "address": "fa:16:3e:44:92:0f", "network": {"id": "12281049-d2b1-40ef-9535-ec69961f84f0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1145425560-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f56ecf15c14a5f9dc32756b35ebccc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93d39487-d3", "ovs_interfaceid": "93d39487-d3d7-4773-9646-52ea7199e328", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 05 06:35:05 compute-0 nova_compute[186329]: 2025-12-05 06:35:05.011 186333 DEBUG nova.network.os_vif_util [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Converting VIF {"id": "93d39487-d3d7-4773-9646-52ea7199e328", "address": "fa:16:3e:44:92:0f", "network": {"id": "12281049-d2b1-40ef-9535-ec69961f84f0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1145425560-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f56ecf15c14a5f9dc32756b35ebccc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93d39487-d3", "ovs_interfaceid": "93d39487-d3d7-4773-9646-52ea7199e328", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:35:05 compute-0 nova_compute[186329]: 2025-12-05 06:35:05.011 186333 DEBUG nova.network.os_vif_util [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:92:0f,bridge_name='br-int',has_traffic_filtering=True,id=93d39487-d3d7-4773-9646-52ea7199e328,network=Network(12281049-d2b1-40ef-9535-ec69961f84f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93d39487-d3') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:35:05 compute-0 nova_compute[186329]: 2025-12-05 06:35:05.011 186333 DEBUG os_vif [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:92:0f,bridge_name='br-int',has_traffic_filtering=True,id=93d39487-d3d7-4773-9646-52ea7199e328,network=Network(12281049-d2b1-40ef-9535-ec69961f84f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93d39487-d3') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 05 06:35:05 compute-0 nova_compute[186329]: 2025-12-05 06:35:05.012 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:05 compute-0 nova_compute[186329]: 2025-12-05 06:35:05.012 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:35:05 compute-0 nova_compute[186329]: 2025-12-05 06:35:05.012 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:35:05 compute-0 nova_compute[186329]: 2025-12-05 06:35:05.013 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:05 compute-0 nova_compute[186329]: 2025-12-05 06:35:05.013 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'e59e1403-4f5c-505b-8e5f-c10e860beb32', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:35:05 compute-0 nova_compute[186329]: 2025-12-05 06:35:05.015 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:05 compute-0 nova_compute[186329]: 2025-12-05 06:35:05.017 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:35:05 compute-0 nova_compute[186329]: 2025-12-05 06:35:05.020 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:05 compute-0 nova_compute[186329]: 2025-12-05 06:35:05.022 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:05 compute-0 nova_compute[186329]: 2025-12-05 06:35:05.022 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap93d39487-d3, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:35:05 compute-0 nova_compute[186329]: 2025-12-05 06:35:05.023 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap93d39487-d3, col_values=(('qos', UUID('55538c76-513b-4c26-84ea-5844573052eb')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:35:05 compute-0 nova_compute[186329]: 2025-12-05 06:35:05.023 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap93d39487-d3, col_values=(('external_ids', {'iface-id': '93d39487-d3d7-4773-9646-52ea7199e328', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:44:92:0f', 'vm-uuid': '534583f6-ef6e-4921-a665-be74a1ebf1ee'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:35:05 compute-0 nova_compute[186329]: 2025-12-05 06:35:05.024 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:05 compute-0 NetworkManager[55434]: <info>  [1764916505.0245] manager: (tap93d39487-d3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/94)
Dec 05 06:35:05 compute-0 nova_compute[186329]: 2025-12-05 06:35:05.025 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:35:05 compute-0 nova_compute[186329]: 2025-12-05 06:35:05.031 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:05 compute-0 nova_compute[186329]: 2025-12-05 06:35:05.032 186333 INFO os_vif [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:92:0f,bridge_name='br-int',has_traffic_filtering=True,id=93d39487-d3d7-4773-9646-52ea7199e328,network=Network(12281049-d2b1-40ef-9535-ec69961f84f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93d39487-d3')
Dec 05 06:35:05 compute-0 nova_compute[186329]: 2025-12-05 06:35:05.504 186333 DEBUG nova.objects.base [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Object Instance<89d73880-ffbb-49c5-9e2d-a49a64c44523> lazy-loaded attributes: info_cache,numa_topology wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Dec 05 06:35:05 compute-0 nova_compute[186329]: 2025-12-05 06:35:05.505 186333 DEBUG nova.objects.instance [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lazy-loading 'resources' on Instance uuid 89d73880-ffbb-49c5-9e2d-a49a64c44523 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:35:06 compute-0 nova_compute[186329]: 2025-12-05 06:35:06.008 186333 DEBUG nova.objects.base [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Object Instance<89d73880-ffbb-49c5-9e2d-a49a64c44523> lazy-loaded attributes: info_cache,numa_topology,resources wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Dec 05 06:35:06 compute-0 nova_compute[186329]: 2025-12-05 06:35:06.009 186333 DEBUG nova.objects.instance [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lazy-loading 'system_metadata' on Instance uuid 89d73880-ffbb-49c5-9e2d-a49a64c44523 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:35:06 compute-0 nova_compute[186329]: 2025-12-05 06:35:06.513 186333 DEBUG nova.objects.base [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Object Instance<89d73880-ffbb-49c5-9e2d-a49a64c44523> lazy-loaded attributes: info_cache,numa_topology,resources,system_metadata wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Dec 05 06:35:06 compute-0 nova_compute[186329]: 2025-12-05 06:35:06.514 186333 INFO nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] [instance: 89d73880-ffbb-49c5-9e2d-a49a64c44523] Deleting instance files /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523_del
Dec 05 06:35:06 compute-0 nova_compute[186329]: 2025-12-05 06:35:06.514 186333 INFO nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] [instance: 89d73880-ffbb-49c5-9e2d-a49a64c44523] Deletion of /var/lib/nova/instances/89d73880-ffbb-49c5-9e2d-a49a64c44523_del complete
Dec 05 06:35:06 compute-0 nova_compute[186329]: 2025-12-05 06:35:06.560 186333 DEBUG nova.virt.libvirt.driver [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 05 06:35:06 compute-0 nova_compute[186329]: 2025-12-05 06:35:06.560 186333 DEBUG nova.virt.libvirt.driver [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Dec 05 06:35:06 compute-0 nova_compute[186329]: 2025-12-05 06:35:06.560 186333 DEBUG nova.virt.libvirt.driver [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] No VIF found with MAC fa:16:3e:44:92:0f, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Dec 05 06:35:06 compute-0 nova_compute[186329]: 2025-12-05 06:35:06.561 186333 INFO nova.virt.libvirt.driver [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Using config drive
Dec 05 06:35:07 compute-0 nova_compute[186329]: 2025-12-05 06:35:07.021 186333 INFO nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] [instance: 89d73880-ffbb-49c5-9e2d-a49a64c44523] Took 2.25 seconds to destroy the instance on the hypervisor.
Dec 05 06:35:07 compute-0 nova_compute[186329]: 2025-12-05 06:35:07.021 186333 DEBUG oslo.service.backend._eventlet.loopingcall [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 05 06:35:07 compute-0 nova_compute[186329]: 2025-12-05 06:35:07.022 186333 DEBUG nova.compute.manager [-] [instance: 89d73880-ffbb-49c5-9e2d-a49a64c44523] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 05 06:35:07 compute-0 nova_compute[186329]: 2025-12-05 06:35:07.022 186333 DEBUG nova.network.neutron [-] [instance: 89d73880-ffbb-49c5-9e2d-a49a64c44523] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 05 06:35:07 compute-0 nova_compute[186329]: 2025-12-05 06:35:07.022 186333 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:35:07 compute-0 nova_compute[186329]: 2025-12-05 06:35:07.066 186333 WARNING neutronclient.v2_0.client [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:35:07 compute-0 nova_compute[186329]: 2025-12-05 06:35:07.667 186333 DEBUG nova.network.neutron [-] [instance: 89d73880-ffbb-49c5-9e2d-a49a64c44523] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Dec 05 06:35:07 compute-0 nova_compute[186329]: 2025-12-05 06:35:07.667 186333 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:35:07 compute-0 nova_compute[186329]: 2025-12-05 06:35:07.731 186333 INFO nova.virt.libvirt.driver [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Creating config drive at /var/lib/nova/instances/534583f6-ef6e-4921-a665-be74a1ebf1ee/disk.config
Dec 05 06:35:07 compute-0 nova_compute[186329]: 2025-12-05 06:35:07.736 186333 DEBUG oslo_concurrency.processutils [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/534583f6-ef6e-4921-a665-be74a1ebf1ee/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpalnl4hjh execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:35:07 compute-0 nova_compute[186329]: 2025-12-05 06:35:07.852 186333 DEBUG oslo_concurrency.processutils [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/534583f6-ef6e-4921-a665-be74a1ebf1ee/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20251105112212.710ffbb.el10 -quiet -J -r -V config-2 /tmp/tmpalnl4hjh" returned: 0 in 0.117s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:35:07 compute-0 NetworkManager[55434]: <info>  [1764916507.8845] manager: (tap93d39487-d3): new Tun device (/org/freedesktop/NetworkManager/Devices/95)
Dec 05 06:35:07 compute-0 kernel: tap93d39487-d3: entered promiscuous mode
Dec 05 06:35:07 compute-0 nova_compute[186329]: 2025-12-05 06:35:07.895 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:07 compute-0 ovn_controller[95223]: 2025-12-05T06:35:07Z|00225|binding|INFO|Claiming lport 93d39487-d3d7-4773-9646-52ea7199e328 for this chassis.
Dec 05 06:35:07 compute-0 ovn_controller[95223]: 2025-12-05T06:35:07Z|00226|binding|INFO|93d39487-d3d7-4773-9646-52ea7199e328: Claiming fa:16:3e:44:92:0f 10.100.0.8
Dec 05 06:35:07 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:07.906 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:44:92:0f 10.100.0.8'], port_security=['fa:16:3e:44:92:0f 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '534583f6-ef6e-4921-a665-be74a1ebf1ee', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-12281049-d2b1-40ef-9535-ec69961f84f0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7aae6fc46c7e46a0a68d5efcb8c24f87', 'neutron:revision_number': '4', 'neutron:security_group_ids': '79382e11-3430-4326-910e-567b5a1dc769', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=85adff9c-df19-421b-8711-6ce155f4dd7c, chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], logical_port=93d39487-d3d7-4773-9646-52ea7199e328) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:35:07 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:07.906 104041 INFO neutron.agent.ovn.metadata.agent [-] Port 93d39487-d3d7-4773-9646-52ea7199e328 in datapath 12281049-d2b1-40ef-9535-ec69961f84f0 bound to our chassis
Dec 05 06:35:07 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:07.907 104041 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 12281049-d2b1-40ef-9535-ec69961f84f0
Dec 05 06:35:07 compute-0 systemd-udevd[215673]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 06:35:07 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:07.915 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[769bfdf3-987d-4062-9cef-67b8f518153e]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:07 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:07.915 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap12281049-d1 in ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 05 06:35:07 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:07.917 206383 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap12281049-d0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 05 06:35:07 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:07.917 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[578507d2-3faf-4698-9cd7-8d2c3f8c3ef8]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:07 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:07.917 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[306ec7a1-09d3-4c51-b94e-81cea3dce959]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:07 compute-0 systemd-machined[152967]: New machine qemu-21-instance-0000001b.
Dec 05 06:35:07 compute-0 NetworkManager[55434]: <info>  [1764916507.9223] device (tap93d39487-d3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 06:35:07 compute-0 NetworkManager[55434]: <info>  [1764916507.9229] device (tap93d39487-d3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 06:35:07 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:07.928 104153 DEBUG oslo.privsep.daemon [-] privsep: reply[7ada2567-6ed1-4913-9704-f7494af85559]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:07 compute-0 systemd[1]: Started Virtual Machine qemu-21-instance-0000001b.
Dec 05 06:35:07 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:07.946 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[dadbb119-e681-441c-873f-4aafaf9a4a5e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:07 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:07.966 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[62400139-0c92-4c70-9ca8-fb92f31c29e4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:07 compute-0 systemd-udevd[215676]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 06:35:07 compute-0 NetworkManager[55434]: <info>  [1764916507.9871] manager: (tap12281049-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/96)
Dec 05 06:35:07 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:07.986 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[ee30890d-9b68-4bc2-a7c1-52c2bd9e8c5a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:08 compute-0 nova_compute[186329]: 2025-12-05 06:35:07.999 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:08 compute-0 ovn_controller[95223]: 2025-12-05T06:35:08Z|00227|binding|INFO|Setting lport 93d39487-d3d7-4773-9646-52ea7199e328 ovn-installed in OVS
Dec 05 06:35:08 compute-0 ovn_controller[95223]: 2025-12-05T06:35:08Z|00228|binding|INFO|Setting lport 93d39487-d3d7-4773-9646-52ea7199e328 up in Southbound
Dec 05 06:35:08 compute-0 nova_compute[186329]: 2025-12-05 06:35:08.005 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:08.010 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[35363d11-d589-4332-83af-541452b28cb8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:08.012 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[f6c3ada9-3530-487a-a840-7f3939bedbcb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:08 compute-0 NetworkManager[55434]: <info>  [1764916508.0285] device (tap12281049-d0): carrier: link connected
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:08.031 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[c38075a2-4e62-435d-9992-f526308fc919]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:08.043 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[562ef673-f030-42e3-9969-8c0926b2f3f0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap12281049-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e7:a3:1b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438656, 'reachable_time': 43220, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215706, 'error': None, 'target': 'ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:08.054 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[14f2f7e5-ac0e-4123-94e2-6956c16b977a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee7:a31b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 438656, 'tstamp': 438656}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215707, 'error': None, 'target': 'ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:08.066 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[3630510e-6385-4c43-b2fe-5c7c63529a18]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap12281049-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e7:a3:1b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438656, 'reachable_time': 43220, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215708, 'error': None, 'target': 'ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:08.086 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[7997032f-4a83-4a56-a79b-c49b62e4edf3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:08 compute-0 nova_compute[186329]: 2025-12-05 06:35:08.123 186333 DEBUG nova.compute.manager [req-aa575231-fd8e-451e-a7f2-342d00ee9a16 req-39851dd1-249e-4e99-ad6c-1b4c703211e2 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Received event network-vif-plugged-93d39487-d3d7-4773-9646-52ea7199e328 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:35:08 compute-0 nova_compute[186329]: 2025-12-05 06:35:08.123 186333 DEBUG oslo_concurrency.lockutils [req-aa575231-fd8e-451e-a7f2-342d00ee9a16 req-39851dd1-249e-4e99-ad6c-1b4c703211e2 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "534583f6-ef6e-4921-a665-be74a1ebf1ee-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:35:08 compute-0 nova_compute[186329]: 2025-12-05 06:35:08.124 186333 DEBUG oslo_concurrency.lockutils [req-aa575231-fd8e-451e-a7f2-342d00ee9a16 req-39851dd1-249e-4e99-ad6c-1b4c703211e2 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "534583f6-ef6e-4921-a665-be74a1ebf1ee-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:35:08 compute-0 nova_compute[186329]: 2025-12-05 06:35:08.124 186333 DEBUG oslo_concurrency.lockutils [req-aa575231-fd8e-451e-a7f2-342d00ee9a16 req-39851dd1-249e-4e99-ad6c-1b4c703211e2 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "534583f6-ef6e-4921-a665-be74a1ebf1ee-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:35:08 compute-0 nova_compute[186329]: 2025-12-05 06:35:08.124 186333 DEBUG nova.compute.manager [req-aa575231-fd8e-451e-a7f2-342d00ee9a16 req-39851dd1-249e-4e99-ad6c-1b4c703211e2 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Processing event network-vif-plugged-93d39487-d3d7-4773-9646-52ea7199e328 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11576
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:08.125 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[42019ec4-a7cb-4db8-a1df-d95b289a3918]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:08.126 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap12281049-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:08.126 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:08.126 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap12281049-d0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:35:08 compute-0 nova_compute[186329]: 2025-12-05 06:35:08.128 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:08 compute-0 NetworkManager[55434]: <info>  [1764916508.1283] manager: (tap12281049-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/97)
Dec 05 06:35:08 compute-0 kernel: tap12281049-d0: entered promiscuous mode
Dec 05 06:35:08 compute-0 nova_compute[186329]: 2025-12-05 06:35:08.135 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:08.136 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap12281049-d0, col_values=(('external_ids', {'iface-id': 'cc9bed08-ff40-4708-a7bc-12e84332e3cc'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:35:08 compute-0 ovn_controller[95223]: 2025-12-05T06:35:08Z|00229|binding|INFO|Releasing lport cc9bed08-ff40-4708-a7bc-12e84332e3cc from this chassis (sb_readonly=0)
Dec 05 06:35:08 compute-0 nova_compute[186329]: 2025-12-05 06:35:08.137 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:08 compute-0 nova_compute[186329]: 2025-12-05 06:35:08.159 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:08.160 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[14258fd4-7701-41f8-be60-8a49f44b6a81]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:08.160 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/12281049-d2b1-40ef-9535-ec69961f84f0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/12281049-d2b1-40ef-9535-ec69961f84f0.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:08.160 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/12281049-d2b1-40ef-9535-ec69961f84f0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/12281049-d2b1-40ef-9535-ec69961f84f0.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:08.160 104041 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 12281049-d2b1-40ef-9535-ec69961f84f0 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:08.160 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/12281049-d2b1-40ef-9535-ec69961f84f0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/12281049-d2b1-40ef-9535-ec69961f84f0.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:08.161 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[61261fb2-e5cb-40ef-98cb-fa10fd15570d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:08.161 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/12281049-d2b1-40ef-9535-ec69961f84f0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/12281049-d2b1-40ef-9535-ec69961f84f0.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:08.161 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[2464d065-79dd-4594-a552-7d1d76de8d06]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:08.161 104041 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]: global
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]:     log         /dev/log local0 debug
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]:     log-tag     haproxy-metadata-proxy-12281049-d2b1-40ef-9535-ec69961f84f0
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]:     user        root
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]:     group       root
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]:     maxconn     1024
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]:     pidfile     /var/lib/neutron/external/pids/12281049-d2b1-40ef-9535-ec69961f84f0.pid.haproxy
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]:     daemon
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]: defaults
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]:     log global
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]:     mode http
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]:     option httplog
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]:     option dontlognull
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]:     option http-server-close
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]:     option forwardfor
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]:     retries                 3
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]:     timeout http-request    30s
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]:     timeout connect         30s
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]:     timeout client          32s
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]:     timeout server          32s
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]:     timeout http-keep-alive 30s
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]: listen listener
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]:     bind 169.254.169.254:80
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]:     
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]:     http-request add-header X-OVN-Network-ID 12281049-d2b1-40ef-9535-ec69961f84f0
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 05 06:35:08 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:08.162 104041 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0', 'env', 'PROCESS_TAG=haproxy-12281049-d2b1-40ef-9535-ec69961f84f0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/12281049-d2b1-40ef-9535-ec69961f84f0.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 05 06:35:08 compute-0 nova_compute[186329]: 2025-12-05 06:35:08.171 186333 DEBUG nova.network.neutron [-] [instance: 89d73880-ffbb-49c5-9e2d-a49a64c44523] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:106
Dec 05 06:35:08 compute-0 nova_compute[186329]: 2025-12-05 06:35:08.171 186333 INFO nova.compute.manager [-] [instance: 89d73880-ffbb-49c5-9e2d-a49a64c44523] Took 1.15 seconds to deallocate network for instance.
Dec 05 06:35:08 compute-0 nova_compute[186329]: 2025-12-05 06:35:08.172 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:35:08 compute-0 nova_compute[186329]: 2025-12-05 06:35:08.370 186333 DEBUG nova.compute.manager [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:603
Dec 05 06:35:08 compute-0 nova_compute[186329]: 2025-12-05 06:35:08.373 186333 DEBUG nova.virt.libvirt.driver [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Dec 05 06:35:08 compute-0 nova_compute[186329]: 2025-12-05 06:35:08.375 186333 INFO nova.virt.libvirt.driver [-] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Instance spawned successfully.
Dec 05 06:35:08 compute-0 nova_compute[186329]: 2025-12-05 06:35:08.375 186333 DEBUG nova.virt.libvirt.driver [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Dec 05 06:35:08 compute-0 podman[215759]: 2025-12-05 06:35:08.463045172 +0000 UTC m=+0.029313774 container create e021a63f54ea9cc58cc0f85e43ff5c81db5782cca6ca7ee261f2e2317d4ede5b (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 05 06:35:08 compute-0 systemd[1]: Started libpod-conmon-e021a63f54ea9cc58cc0f85e43ff5c81db5782cca6ca7ee261f2e2317d4ede5b.scope.
Dec 05 06:35:08 compute-0 systemd[1]: Started libcrun container.
Dec 05 06:35:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b3e0b42013c85d7dbb3f17ad87fbcbe517b3c4a6debbc8fec68648f22fcd4825/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 06:35:08 compute-0 podman[215759]: 2025-12-05 06:35:08.52003564 +0000 UTC m=+0.086304252 container init e021a63f54ea9cc58cc0f85e43ff5c81db5782cca6ca7ee261f2e2317d4ede5b (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 05 06:35:08 compute-0 podman[215759]: 2025-12-05 06:35:08.525211461 +0000 UTC m=+0.091480063 container start e021a63f54ea9cc58cc0f85e43ff5c81db5782cca6ca7ee261f2e2317d4ede5b (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec)
Dec 05 06:35:08 compute-0 podman[215759]: 2025-12-05 06:35:08.449775402 +0000 UTC m=+0.016044024 image pull 773fbabd60bc1c8e2ad203301a187a593937fa42bcc751aba0971bd96baac0cb quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current
Dec 05 06:35:08 compute-0 neutron-haproxy-ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0[215771]: [NOTICE]   (215775) : New worker (215777) forked
Dec 05 06:35:08 compute-0 neutron-haproxy-ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0[215771]: [NOTICE]   (215775) : Loading success.
Dec 05 06:35:08 compute-0 nova_compute[186329]: 2025-12-05 06:35:08.884 186333 DEBUG nova.virt.libvirt.driver [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:35:08 compute-0 nova_compute[186329]: 2025-12-05 06:35:08.884 186333 DEBUG nova.virt.libvirt.driver [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:35:08 compute-0 nova_compute[186329]: 2025-12-05 06:35:08.885 186333 DEBUG nova.virt.libvirt.driver [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:35:08 compute-0 nova_compute[186329]: 2025-12-05 06:35:08.885 186333 DEBUG nova.virt.libvirt.driver [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:35:08 compute-0 nova_compute[186329]: 2025-12-05 06:35:08.886 186333 DEBUG nova.virt.libvirt.driver [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:35:08 compute-0 nova_compute[186329]: 2025-12-05 06:35:08.886 186333 DEBUG nova.virt.libvirt.driver [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Dec 05 06:35:09 compute-0 nova_compute[186329]: 2025-12-05 06:35:09.071 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:09 compute-0 nova_compute[186329]: 2025-12-05 06:35:09.394 186333 INFO nova.compute.manager [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Took 6.37 seconds to spawn the instance on the hypervisor.
Dec 05 06:35:09 compute-0 nova_compute[186329]: 2025-12-05 06:35:09.395 186333 DEBUG nova.compute.manager [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 05 06:35:09 compute-0 nova_compute[186329]: 2025-12-05 06:35:09.919 186333 INFO nova.compute.manager [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Took 11.51 seconds to build instance.
Dec 05 06:35:10 compute-0 nova_compute[186329]: 2025-12-05 06:35:10.024 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:10 compute-0 nova_compute[186329]: 2025-12-05 06:35:10.170 186333 DEBUG nova.compute.manager [req-fd105a21-efa8-4ecf-8b5b-d527547dac54 req-ee76d8d6-65f5-4cb2-8381-0611303e8dde fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Received event network-vif-plugged-93d39487-d3d7-4773-9646-52ea7199e328 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:35:10 compute-0 nova_compute[186329]: 2025-12-05 06:35:10.170 186333 DEBUG oslo_concurrency.lockutils [req-fd105a21-efa8-4ecf-8b5b-d527547dac54 req-ee76d8d6-65f5-4cb2-8381-0611303e8dde fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "534583f6-ef6e-4921-a665-be74a1ebf1ee-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:35:10 compute-0 nova_compute[186329]: 2025-12-05 06:35:10.171 186333 DEBUG oslo_concurrency.lockutils [req-fd105a21-efa8-4ecf-8b5b-d527547dac54 req-ee76d8d6-65f5-4cb2-8381-0611303e8dde fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "534583f6-ef6e-4921-a665-be74a1ebf1ee-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:35:10 compute-0 nova_compute[186329]: 2025-12-05 06:35:10.171 186333 DEBUG oslo_concurrency.lockutils [req-fd105a21-efa8-4ecf-8b5b-d527547dac54 req-ee76d8d6-65f5-4cb2-8381-0611303e8dde fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "534583f6-ef6e-4921-a665-be74a1ebf1ee-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:35:10 compute-0 nova_compute[186329]: 2025-12-05 06:35:10.171 186333 DEBUG nova.compute.manager [req-fd105a21-efa8-4ecf-8b5b-d527547dac54 req-ee76d8d6-65f5-4cb2-8381-0611303e8dde fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] No waiting events found dispatching network-vif-plugged-93d39487-d3d7-4773-9646-52ea7199e328 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:35:10 compute-0 nova_compute[186329]: 2025-12-05 06:35:10.171 186333 WARNING nova.compute.manager [req-fd105a21-efa8-4ecf-8b5b-d527547dac54 req-ee76d8d6-65f5-4cb2-8381-0611303e8dde fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Received unexpected event network-vif-plugged-93d39487-d3d7-4773-9646-52ea7199e328 for instance with vm_state active and task_state None.
Dec 05 06:35:10 compute-0 nova_compute[186329]: 2025-12-05 06:35:10.422 186333 DEBUG oslo_concurrency.lockutils [None req-f8e46589-2fa7-4b06-b7d4-3745dcf9a783 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Lock "534583f6-ef6e-4921-a665-be74a1ebf1ee" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.018s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:35:14 compute-0 nova_compute[186329]: 2025-12-05 06:35:14.072 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:15 compute-0 nova_compute[186329]: 2025-12-05 06:35:15.026 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:16 compute-0 podman[215783]: 2025-12-05 06:35:16.468190281 +0000 UTC m=+0.044886903 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 06:35:16 compute-0 podman[215782]: 2025-12-05 06:35:16.490600976 +0000 UTC m=+0.068829699 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec)
Dec 05 06:35:17 compute-0 nova_compute[186329]: 2025-12-05 06:35:17.213 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:35:17 compute-0 nova_compute[186329]: 2025-12-05 06:35:17.213 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:35:17 compute-0 nova_compute[186329]: 2025-12-05 06:35:17.720 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:35:17 compute-0 nova_compute[186329]: 2025-12-05 06:35:17.720 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:35:17 compute-0 nova_compute[186329]: 2025-12-05 06:35:17.720 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:35:17 compute-0 nova_compute[186329]: 2025-12-05 06:35:17.721 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:35:17 compute-0 nova_compute[186329]: 2025-12-05 06:35:17.721 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:35:17 compute-0 nova_compute[186329]: 2025-12-05 06:35:17.721 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:35:17 compute-0 rsyslogd[961]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 05 06:35:18 compute-0 nova_compute[186329]: 2025-12-05 06:35:18.229 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Triggering sync for uuid 534583f6-ef6e-4921-a665-be74a1ebf1ee _sync_power_states /usr/lib/python3.12/site-packages/nova/compute/manager.py:11024
Dec 05 06:35:18 compute-0 nova_compute[186329]: 2025-12-05 06:35:18.229 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "534583f6-ef6e-4921-a665-be74a1ebf1ee" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:35:18 compute-0 nova_compute[186329]: 2025-12-05 06:35:18.229 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "534583f6-ef6e-4921-a665-be74a1ebf1ee" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:35:18 compute-0 nova_compute[186329]: 2025-12-05 06:35:18.229 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:35:18 compute-0 nova_compute[186329]: 2025-12-05 06:35:18.229 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 05 06:35:18 compute-0 nova_compute[186329]: 2025-12-05 06:35:18.230 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:35:18 compute-0 nova_compute[186329]: 2025-12-05 06:35:18.735 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "534583f6-ef6e-4921-a665-be74a1ebf1ee" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.506s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:35:18 compute-0 nova_compute[186329]: 2025-12-05 06:35:18.742 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:35:18 compute-0 nova_compute[186329]: 2025-12-05 06:35:18.742 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:35:18 compute-0 nova_compute[186329]: 2025-12-05 06:35:18.743 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:35:18 compute-0 nova_compute[186329]: 2025-12-05 06:35:18.743 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 05 06:35:19 compute-0 nova_compute[186329]: 2025-12-05 06:35:19.073 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:19 compute-0 ovn_controller[95223]: 2025-12-05T06:35:19Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:44:92:0f 10.100.0.8
Dec 05 06:35:19 compute-0 ovn_controller[95223]: 2025-12-05T06:35:19Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:44:92:0f 10.100.0.8
Dec 05 06:35:19 compute-0 nova_compute[186329]: 2025-12-05 06:35:19.769 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/534583f6-ef6e-4921-a665-be74a1ebf1ee/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:35:19 compute-0 nova_compute[186329]: 2025-12-05 06:35:19.824 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/534583f6-ef6e-4921-a665-be74a1ebf1ee/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:35:19 compute-0 nova_compute[186329]: 2025-12-05 06:35:19.825 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/534583f6-ef6e-4921-a665-be74a1ebf1ee/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:35:19 compute-0 nova_compute[186329]: 2025-12-05 06:35:19.879 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/534583f6-ef6e-4921-a665-be74a1ebf1ee/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:35:20 compute-0 nova_compute[186329]: 2025-12-05 06:35:20.027 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:20 compute-0 nova_compute[186329]: 2025-12-05 06:35:20.072 186333 WARNING nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:35:20 compute-0 nova_compute[186329]: 2025-12-05 06:35:20.073 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:35:20 compute-0 nova_compute[186329]: 2025-12-05 06:35:20.088 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.015s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:35:20 compute-0 nova_compute[186329]: 2025-12-05 06:35:20.088 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5657MB free_disk=73.13523483276367GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": 
"0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 05 06:35:20 compute-0 nova_compute[186329]: 2025-12-05 06:35:20.088 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:35:20 compute-0 nova_compute[186329]: 2025-12-05 06:35:20.089 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:35:20 compute-0 nova_compute[186329]: 2025-12-05 06:35:20.706 186333 DEBUG nova.virt.libvirt.driver [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: d8184114-bc46-46e9-a8b6-2bedf31a51d9] Creating tmpfile /var/lib/nova/instances/tmpm1vpb62b to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Dec 05 06:35:20 compute-0 nova_compute[186329]: 2025-12-05 06:35:20.707 186333 WARNING neutronclient.v2_0.client [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:35:20 compute-0 nova_compute[186329]: 2025-12-05 06:35:20.709 186333 DEBUG nova.compute.manager [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=70656,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpm1vpb62b',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9090
Dec 05 06:35:21 compute-0 nova_compute[186329]: 2025-12-05 06:35:21.620 186333 INFO nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance 83dcc9e9-9fcf-4e34-8b76-598c38ed9bc0 has allocations against this compute host but is not found in the database.
Dec 05 06:35:21 compute-0 nova_compute[186329]: 2025-12-05 06:35:21.620 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance 534583f6-ef6e-4921-a665-be74a1ebf1ee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Dec 05 06:35:22 compute-0 nova_compute[186329]: 2025-12-05 06:35:22.127 186333 WARNING nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance d8184114-bc46-46e9-a8b6-2bedf31a51d9 has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Dec 05 06:35:22 compute-0 nova_compute[186329]: 2025-12-05 06:35:22.127 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 05 06:35:22 compute-0 nova_compute[186329]: 2025-12-05 06:35:22.128 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=4 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 06:35:20 up  1:13,  0 user,  load average: 0.20, 0.10, 0.16\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_7aae6fc46c7e46a0a68d5efcb8c24f87': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 05 06:35:22 compute-0 nova_compute[186329]: 2025-12-05 06:35:22.188 186333 DEBUG nova.compute.provider_tree [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:35:22 compute-0 nova_compute[186329]: 2025-12-05 06:35:22.692 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:35:22 compute-0 nova_compute[186329]: 2025-12-05 06:35:22.744 186333 WARNING neutronclient.v2_0.client [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:35:23 compute-0 nova_compute[186329]: 2025-12-05 06:35:23.198 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 05 06:35:23 compute-0 nova_compute[186329]: 2025-12-05 06:35:23.198 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.110s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:35:23 compute-0 nova_compute[186329]: 2025-12-05 06:35:23.199 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:35:23 compute-0 nova_compute[186329]: 2025-12-05 06:35:23.199 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Dec 05 06:35:23 compute-0 nova_compute[186329]: 2025-12-05 06:35:23.703 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Dec 05 06:35:23 compute-0 nova_compute[186329]: 2025-12-05 06:35:23.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:35:23 compute-0 nova_compute[186329]: 2025-12-05 06:35:23.709 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Dec 05 06:35:24 compute-0 nova_compute[186329]: 2025-12-05 06:35:24.074 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:25 compute-0 nova_compute[186329]: 2025-12-05 06:35:25.029 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:25 compute-0 podman[215850]: 2025-12-05 06:35:25.470002075 +0000 UTC m=+0.046454370 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 05 06:35:25 compute-0 podman[215849]: 2025-12-05 06:35:25.473517996 +0000 UTC m=+0.051717386 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, name=ubi9-minimal, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, version=9.6, config_id=edpm, container_name=openstack_network_exporter, architecture=x86_64, com.redhat.component=ubi9-minimal-container)
Dec 05 06:35:25 compute-0 podman[215848]: 2025-12-05 06:35:25.490499844 +0000 UTC m=+0.070856119 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251202)
Dec 05 06:35:26 compute-0 nova_compute[186329]: 2025-12-05 06:35:26.217 186333 DEBUG oslo_concurrency.lockutils [None req-9bb19d52-446f-4cbb-b985-a2386ce338dd 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Acquiring lock "534583f6-ef6e-4921-a665-be74a1ebf1ee" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:35:26 compute-0 nova_compute[186329]: 2025-12-05 06:35:26.217 186333 DEBUG oslo_concurrency.lockutils [None req-9bb19d52-446f-4cbb-b985-a2386ce338dd 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Lock "534583f6-ef6e-4921-a665-be74a1ebf1ee" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:35:26 compute-0 nova_compute[186329]: 2025-12-05 06:35:26.217 186333 DEBUG oslo_concurrency.lockutils [None req-9bb19d52-446f-4cbb-b985-a2386ce338dd 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Acquiring lock "534583f6-ef6e-4921-a665-be74a1ebf1ee-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:35:26 compute-0 nova_compute[186329]: 2025-12-05 06:35:26.218 186333 DEBUG oslo_concurrency.lockutils [None req-9bb19d52-446f-4cbb-b985-a2386ce338dd 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Lock "534583f6-ef6e-4921-a665-be74a1ebf1ee-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:35:26 compute-0 nova_compute[186329]: 2025-12-05 06:35:26.218 186333 DEBUG oslo_concurrency.lockutils [None req-9bb19d52-446f-4cbb-b985-a2386ce338dd 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Lock "534583f6-ef6e-4921-a665-be74a1ebf1ee-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:35:26 compute-0 nova_compute[186329]: 2025-12-05 06:35:26.225 186333 INFO nova.compute.manager [None req-9bb19d52-446f-4cbb-b985-a2386ce338dd 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Terminating instance
Dec 05 06:35:26 compute-0 nova_compute[186329]: 2025-12-05 06:35:26.734 186333 DEBUG nova.compute.manager [None req-9bb19d52-446f-4cbb-b985-a2386ce338dd 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3201
Dec 05 06:35:26 compute-0 nova_compute[186329]: 2025-12-05 06:35:26.737 186333 DEBUG nova.compute.manager [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=70656,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpm1vpb62b',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='d8184114-bc46-46e9-a8b6-2bedf31a51d9',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9315
Dec 05 06:35:26 compute-0 kernel: tap93d39487-d3 (unregistering): left promiscuous mode
Dec 05 06:35:26 compute-0 NetworkManager[55434]: <info>  [1764916526.7564] device (tap93d39487-d3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 06:35:26 compute-0 nova_compute[186329]: 2025-12-05 06:35:26.761 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:26 compute-0 ovn_controller[95223]: 2025-12-05T06:35:26Z|00230|binding|INFO|Releasing lport 93d39487-d3d7-4773-9646-52ea7199e328 from this chassis (sb_readonly=0)
Dec 05 06:35:26 compute-0 ovn_controller[95223]: 2025-12-05T06:35:26Z|00231|binding|INFO|Setting lport 93d39487-d3d7-4773-9646-52ea7199e328 down in Southbound
Dec 05 06:35:26 compute-0 ovn_controller[95223]: 2025-12-05T06:35:26Z|00232|binding|INFO|Removing iface tap93d39487-d3 ovn-installed in OVS
Dec 05 06:35:26 compute-0 nova_compute[186329]: 2025-12-05 06:35:26.776 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:26 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:26.783 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:44:92:0f 10.100.0.8'], port_security=['fa:16:3e:44:92:0f 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '534583f6-ef6e-4921-a665-be74a1ebf1ee', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-12281049-d2b1-40ef-9535-ec69961f84f0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7aae6fc46c7e46a0a68d5efcb8c24f87', 'neutron:revision_number': '5', 'neutron:security_group_ids': '79382e11-3430-4326-910e-567b5a1dc769', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=85adff9c-df19-421b-8711-6ce155f4dd7c, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], logical_port=93d39487-d3d7-4773-9646-52ea7199e328) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:35:26 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:26.784 104041 INFO neutron.agent.ovn.metadata.agent [-] Port 93d39487-d3d7-4773-9646-52ea7199e328 in datapath 12281049-d2b1-40ef-9535-ec69961f84f0 unbound from our chassis
Dec 05 06:35:26 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:26.785 104041 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 12281049-d2b1-40ef-9535-ec69961f84f0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 05 06:35:26 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:26.786 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[e79c35aa-15e6-4a23-b363-7a9e9797ab5a]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:26 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:26.786 104041 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0 namespace which is not needed anymore
Dec 05 06:35:26 compute-0 nova_compute[186329]: 2025-12-05 06:35:26.787 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:26 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001b.scope: Deactivated successfully.
Dec 05 06:35:26 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001b.scope: Consumed 11.335s CPU time.
Dec 05 06:35:26 compute-0 systemd-machined[152967]: Machine qemu-21-instance-0000001b terminated.
Dec 05 06:35:26 compute-0 neutron-haproxy-ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0[215771]: [NOTICE]   (215775) : haproxy version is 3.0.5-8e879a5
Dec 05 06:35:26 compute-0 podman[215927]: 2025-12-05 06:35:26.861337129 +0000 UTC m=+0.019741848 container kill e021a63f54ea9cc58cc0f85e43ff5c81db5782cca6ca7ee261f2e2317d4ede5b (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec)
Dec 05 06:35:26 compute-0 neutron-haproxy-ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0[215771]: [NOTICE]   (215775) : path to executable is /usr/sbin/haproxy
Dec 05 06:35:26 compute-0 neutron-haproxy-ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0[215771]: [WARNING]  (215775) : Exiting Master process...
Dec 05 06:35:26 compute-0 neutron-haproxy-ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0[215771]: [ALERT]    (215775) : Current worker (215777) exited with code 143 (Terminated)
Dec 05 06:35:26 compute-0 neutron-haproxy-ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0[215771]: [WARNING]  (215775) : All workers exited. Exiting... (0)
Dec 05 06:35:26 compute-0 systemd[1]: libpod-e021a63f54ea9cc58cc0f85e43ff5c81db5782cca6ca7ee261f2e2317d4ede5b.scope: Deactivated successfully.
Dec 05 06:35:26 compute-0 podman[215940]: 2025-12-05 06:35:26.896208722 +0000 UTC m=+0.018094671 container died e021a63f54ea9cc58cc0f85e43ff5c81db5782cca6ca7ee261f2e2317d4ede5b (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 05 06:35:26 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e021a63f54ea9cc58cc0f85e43ff5c81db5782cca6ca7ee261f2e2317d4ede5b-userdata-shm.mount: Deactivated successfully.
Dec 05 06:35:26 compute-0 systemd[1]: var-lib-containers-storage-overlay-b3e0b42013c85d7dbb3f17ad87fbcbe517b3c4a6debbc8fec68648f22fcd4825-merged.mount: Deactivated successfully.
Dec 05 06:35:26 compute-0 nova_compute[186329]: 2025-12-05 06:35:26.913 186333 DEBUG nova.compute.manager [req-2f5b7c0c-770c-4897-aab4-64ee400faf68 req-2610690c-f6d3-4883-b62a-d58246e09c21 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Received event network-vif-unplugged-93d39487-d3d7-4773-9646-52ea7199e328 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:35:26 compute-0 nova_compute[186329]: 2025-12-05 06:35:26.913 186333 DEBUG oslo_concurrency.lockutils [req-2f5b7c0c-770c-4897-aab4-64ee400faf68 req-2610690c-f6d3-4883-b62a-d58246e09c21 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "534583f6-ef6e-4921-a665-be74a1ebf1ee-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:35:26 compute-0 nova_compute[186329]: 2025-12-05 06:35:26.914 186333 DEBUG oslo_concurrency.lockutils [req-2f5b7c0c-770c-4897-aab4-64ee400faf68 req-2610690c-f6d3-4883-b62a-d58246e09c21 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "534583f6-ef6e-4921-a665-be74a1ebf1ee-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:35:26 compute-0 nova_compute[186329]: 2025-12-05 06:35:26.914 186333 DEBUG oslo_concurrency.lockutils [req-2f5b7c0c-770c-4897-aab4-64ee400faf68 req-2610690c-f6d3-4883-b62a-d58246e09c21 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "534583f6-ef6e-4921-a665-be74a1ebf1ee-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:35:26 compute-0 nova_compute[186329]: 2025-12-05 06:35:26.914 186333 DEBUG nova.compute.manager [req-2f5b7c0c-770c-4897-aab4-64ee400faf68 req-2610690c-f6d3-4883-b62a-d58246e09c21 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] No waiting events found dispatching network-vif-unplugged-93d39487-d3d7-4773-9646-52ea7199e328 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:35:26 compute-0 nova_compute[186329]: 2025-12-05 06:35:26.914 186333 DEBUG nova.compute.manager [req-2f5b7c0c-770c-4897-aab4-64ee400faf68 req-2610690c-f6d3-4883-b62a-d58246e09c21 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Received event network-vif-unplugged-93d39487-d3d7-4773-9646-52ea7199e328 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 05 06:35:26 compute-0 podman[215940]: 2025-12-05 06:35:26.91586098 +0000 UTC m=+0.037746930 container cleanup e021a63f54ea9cc58cc0f85e43ff5c81db5782cca6ca7ee261f2e2317d4ede5b (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.build-date=20251202, tcib_managed=true)
Dec 05 06:35:26 compute-0 systemd[1]: libpod-conmon-e021a63f54ea9cc58cc0f85e43ff5c81db5782cca6ca7ee261f2e2317d4ede5b.scope: Deactivated successfully.
Dec 05 06:35:26 compute-0 podman[215941]: 2025-12-05 06:35:26.927281392 +0000 UTC m=+0.047917772 container remove e021a63f54ea9cc58cc0f85e43ff5c81db5782cca6ca7ee261f2e2317d4ede5b (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec)
Dec 05 06:35:26 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:26.930 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[0053aa6f-1af6-4124-9c39-72c8d884ed3d]: (4, ("Fri Dec  5 06:35:26 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0 (e021a63f54ea9cc58cc0f85e43ff5c81db5782cca6ca7ee261f2e2317d4ede5b)\ne021a63f54ea9cc58cc0f85e43ff5c81db5782cca6ca7ee261f2e2317d4ede5b\nFri Dec  5 06:35:26 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0 (e021a63f54ea9cc58cc0f85e43ff5c81db5782cca6ca7ee261f2e2317d4ede5b)\ne021a63f54ea9cc58cc0f85e43ff5c81db5782cca6ca7ee261f2e2317d4ede5b\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:26 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:26.931 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[14044c00-9de3-4922-871b-25d13b7e5c1f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:26 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:26.931 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/12281049-d2b1-40ef-9535-ec69961f84f0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/12281049-d2b1-40ef-9535-ec69961f84f0.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:35:26 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:26.932 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[6411e312-e2c3-4826-836b-51ffc84dedcd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:26 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:26.932 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap12281049-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:35:26 compute-0 nova_compute[186329]: 2025-12-05 06:35:26.933 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:26 compute-0 nova_compute[186329]: 2025-12-05 06:35:26.946 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:26 compute-0 kernel: tap12281049-d0: left promiscuous mode
Dec 05 06:35:26 compute-0 nova_compute[186329]: 2025-12-05 06:35:26.958 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:26 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:26.960 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[7d8ea276-7300-43dd-b2c3-db6f2db8f69f]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:26 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:26.967 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[34acf29c-d9d4-4227-9c01-e079268b702b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:26 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:26.968 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[98fc5902-4493-4935-abf5-f5e5a4aa55c3]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:26 compute-0 nova_compute[186329]: 2025-12-05 06:35:26.977 186333 INFO nova.virt.libvirt.driver [-] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Instance destroyed successfully.
Dec 05 06:35:26 compute-0 nova_compute[186329]: 2025-12-05 06:35:26.977 186333 DEBUG nova.objects.instance [None req-9bb19d52-446f-4cbb-b985-a2386ce338dd 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Lazy-loading 'resources' on Instance uuid 534583f6-ef6e-4921-a665-be74a1ebf1ee obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:35:26 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:26.985 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[f0f93844-79da-4826-80df-d5efcde05843]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438649, 'reachable_time': 26672, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215987, 'error': None, 'target': 'ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:26 compute-0 systemd[1]: run-netns-ovnmeta\x2d12281049\x2dd2b1\x2d40ef\x2d9535\x2dec69961f84f0.mount: Deactivated successfully.
Dec 05 06:35:26 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:26.988 104153 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 05 06:35:26 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:26.988 104153 DEBUG oslo.privsep.daemon [-] privsep: reply[fb6eb4e6-09e5-4f8c-aba9-62378836a570]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:27 compute-0 nova_compute[186329]: 2025-12-05 06:35:27.481 186333 DEBUG nova.virt.libvirt.vif [None req-9bb19d52-446f-4cbb-b985-a2386ce338dd 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-12-05T06:34:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-95184258',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-95184258',id=27,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T06:35:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7aae6fc46c7e46a0a68d5efcb8c24f87',ramdisk_id='',reservation_id='r-zv5oq78d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,member,admin,reader',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min
_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-617599826',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-617599826-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-05T06:35:09Z,user_data=None,user_id='0bd6252356344de684e4bce5dcc3c2d3',uuid=534583f6-ef6e-4921-a665-be74a1ebf1ee,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "93d39487-d3d7-4773-9646-52ea7199e328", "address": "fa:16:3e:44:92:0f", "network": {"id": "12281049-d2b1-40ef-9535-ec69961f84f0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1145425560-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f56ecf15c14a5f9dc32756b35ebccc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93d39487-d3", "ovs_interfaceid": "93d39487-d3d7-4773-9646-52ea7199e328", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Dec 05 06:35:27 compute-0 nova_compute[186329]: 2025-12-05 06:35:27.481 186333 DEBUG nova.network.os_vif_util [None req-9bb19d52-446f-4cbb-b985-a2386ce338dd 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Converting VIF {"id": "93d39487-d3d7-4773-9646-52ea7199e328", "address": "fa:16:3e:44:92:0f", "network": {"id": "12281049-d2b1-40ef-9535-ec69961f84f0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1145425560-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f56ecf15c14a5f9dc32756b35ebccc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93d39487-d3", "ovs_interfaceid": "93d39487-d3d7-4773-9646-52ea7199e328", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:35:27 compute-0 nova_compute[186329]: 2025-12-05 06:35:27.482 186333 DEBUG nova.network.os_vif_util [None req-9bb19d52-446f-4cbb-b985-a2386ce338dd 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:92:0f,bridge_name='br-int',has_traffic_filtering=True,id=93d39487-d3d7-4773-9646-52ea7199e328,network=Network(12281049-d2b1-40ef-9535-ec69961f84f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93d39487-d3') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:35:27 compute-0 nova_compute[186329]: 2025-12-05 06:35:27.482 186333 DEBUG os_vif [None req-9bb19d52-446f-4cbb-b985-a2386ce338dd 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:92:0f,bridge_name='br-int',has_traffic_filtering=True,id=93d39487-d3d7-4773-9646-52ea7199e328,network=Network(12281049-d2b1-40ef-9535-ec69961f84f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93d39487-d3') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Dec 05 06:35:27 compute-0 nova_compute[186329]: 2025-12-05 06:35:27.483 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:27 compute-0 nova_compute[186329]: 2025-12-05 06:35:27.483 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap93d39487-d3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:35:27 compute-0 nova_compute[186329]: 2025-12-05 06:35:27.484 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:27 compute-0 nova_compute[186329]: 2025-12-05 06:35:27.486 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:35:27 compute-0 nova_compute[186329]: 2025-12-05 06:35:27.491 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:27 compute-0 nova_compute[186329]: 2025-12-05 06:35:27.492 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:27 compute-0 nova_compute[186329]: 2025-12-05 06:35:27.493 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=55538c76-513b-4c26-84ea-5844573052eb) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:35:27 compute-0 nova_compute[186329]: 2025-12-05 06:35:27.493 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:27 compute-0 nova_compute[186329]: 2025-12-05 06:35:27.495 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:35:27 compute-0 nova_compute[186329]: 2025-12-05 06:35:27.499 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:27 compute-0 nova_compute[186329]: 2025-12-05 06:35:27.500 186333 INFO os_vif [None req-9bb19d52-446f-4cbb-b985-a2386ce338dd 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:92:0f,bridge_name='br-int',has_traffic_filtering=True,id=93d39487-d3d7-4773-9646-52ea7199e328,network=Network(12281049-d2b1-40ef-9535-ec69961f84f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93d39487-d3')
Dec 05 06:35:27 compute-0 nova_compute[186329]: 2025-12-05 06:35:27.500 186333 INFO nova.virt.libvirt.driver [None req-9bb19d52-446f-4cbb-b985-a2386ce338dd 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Deleting instance files /var/lib/nova/instances/534583f6-ef6e-4921-a665-be74a1ebf1ee_del
Dec 05 06:35:27 compute-0 nova_compute[186329]: 2025-12-05 06:35:27.501 186333 INFO nova.virt.libvirt.driver [None req-9bb19d52-446f-4cbb-b985-a2386ce338dd 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Deletion of /var/lib/nova/instances/534583f6-ef6e-4921-a665-be74a1ebf1ee_del complete
Dec 05 06:35:27 compute-0 nova_compute[186329]: 2025-12-05 06:35:27.746 186333 DEBUG oslo_concurrency.lockutils [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "refresh_cache-d8184114-bc46-46e9-a8b6-2bedf31a51d9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:35:27 compute-0 nova_compute[186329]: 2025-12-05 06:35:27.747 186333 DEBUG oslo_concurrency.lockutils [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquired lock "refresh_cache-d8184114-bc46-46e9-a8b6-2bedf31a51d9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:35:27 compute-0 nova_compute[186329]: 2025-12-05 06:35:27.747 186333 DEBUG nova.network.neutron [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: d8184114-bc46-46e9-a8b6-2bedf31a51d9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 05 06:35:28 compute-0 nova_compute[186329]: 2025-12-05 06:35:28.009 186333 INFO nova.compute.manager [None req-9bb19d52-446f-4cbb-b985-a2386ce338dd 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Took 1.28 seconds to destroy the instance on the hypervisor.
Dec 05 06:35:28 compute-0 nova_compute[186329]: 2025-12-05 06:35:28.010 186333 DEBUG oslo.service.backend._eventlet.loopingcall [None req-9bb19d52-446f-4cbb-b985-a2386ce338dd 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Dec 05 06:35:28 compute-0 nova_compute[186329]: 2025-12-05 06:35:28.010 186333 DEBUG nova.compute.manager [-] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2328
Dec 05 06:35:28 compute-0 nova_compute[186329]: 2025-12-05 06:35:28.010 186333 DEBUG nova.network.neutron [-] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Dec 05 06:35:28 compute-0 nova_compute[186329]: 2025-12-05 06:35:28.010 186333 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:35:28 compute-0 nova_compute[186329]: 2025-12-05 06:35:28.256 186333 WARNING neutronclient.v2_0.client [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:35:28 compute-0 nova_compute[186329]: 2025-12-05 06:35:28.727 186333 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:35:28 compute-0 nova_compute[186329]: 2025-12-05 06:35:28.971 186333 DEBUG nova.compute.manager [req-8d3753d5-24ac-43f8-802a-3ff040aac1a2 req-f830defc-d2ac-43bf-9fa1-f65e2315d16a fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Received event network-vif-unplugged-93d39487-d3d7-4773-9646-52ea7199e328 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:35:28 compute-0 nova_compute[186329]: 2025-12-05 06:35:28.972 186333 DEBUG oslo_concurrency.lockutils [req-8d3753d5-24ac-43f8-802a-3ff040aac1a2 req-f830defc-d2ac-43bf-9fa1-f65e2315d16a fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "534583f6-ef6e-4921-a665-be74a1ebf1ee-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:35:28 compute-0 nova_compute[186329]: 2025-12-05 06:35:28.972 186333 DEBUG oslo_concurrency.lockutils [req-8d3753d5-24ac-43f8-802a-3ff040aac1a2 req-f830defc-d2ac-43bf-9fa1-f65e2315d16a fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "534583f6-ef6e-4921-a665-be74a1ebf1ee-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:35:28 compute-0 nova_compute[186329]: 2025-12-05 06:35:28.972 186333 DEBUG oslo_concurrency.lockutils [req-8d3753d5-24ac-43f8-802a-3ff040aac1a2 req-f830defc-d2ac-43bf-9fa1-f65e2315d16a fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "534583f6-ef6e-4921-a665-be74a1ebf1ee-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:35:28 compute-0 nova_compute[186329]: 2025-12-05 06:35:28.972 186333 DEBUG nova.compute.manager [req-8d3753d5-24ac-43f8-802a-3ff040aac1a2 req-f830defc-d2ac-43bf-9fa1-f65e2315d16a fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] No waiting events found dispatching network-vif-unplugged-93d39487-d3d7-4773-9646-52ea7199e328 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:345
Dec 05 06:35:28 compute-0 nova_compute[186329]: 2025-12-05 06:35:28.972 186333 DEBUG nova.compute.manager [req-8d3753d5-24ac-43f8-802a-3ff040aac1a2 req-f830defc-d2ac-43bf-9fa1-f65e2315d16a fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Received event network-vif-unplugged-93d39487-d3d7-4773-9646-52ea7199e328 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11594
Dec 05 06:35:29 compute-0 nova_compute[186329]: 2025-12-05 06:35:29.011 186333 WARNING neutronclient.v2_0.client [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:35:29 compute-0 nova_compute[186329]: 2025-12-05 06:35:29.075 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:29 compute-0 nova_compute[186329]: 2025-12-05 06:35:29.125 186333 DEBUG nova.network.neutron [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: d8184114-bc46-46e9-a8b6-2bedf31a51d9] Updating instance_info_cache with network_info: [{"id": "1f310d80-5c1c-45b1-8d19-1bf63071749b", "address": "fa:16:3e:ec:9e:04", "network": {"id": "12281049-d2b1-40ef-9535-ec69961f84f0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1145425560-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f56ecf15c14a5f9dc32756b35ebccc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f310d80-5c", "ovs_interfaceid": "1f310d80-5c1c-45b1-8d19-1bf63071749b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:35:29 compute-0 nova_compute[186329]: 2025-12-05 06:35:29.483 186333 DEBUG nova.network.neutron [-] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:35:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:29.523 104041 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:35:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:29.524 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:35:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:29.524 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:35:29 compute-0 nova_compute[186329]: 2025-12-05 06:35:29.629 186333 DEBUG oslo_concurrency.lockutils [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Releasing lock "refresh_cache-d8184114-bc46-46e9-a8b6-2bedf31a51d9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:35:29 compute-0 nova_compute[186329]: 2025-12-05 06:35:29.636 186333 DEBUG nova.virt.libvirt.driver [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: d8184114-bc46-46e9-a8b6-2bedf31a51d9] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=70656,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpm1vpb62b',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='d8184114-bc46-46e9-a8b6-2bedf31a51d9',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Dec 05 06:35:29 compute-0 nova_compute[186329]: 2025-12-05 06:35:29.637 186333 DEBUG nova.virt.libvirt.driver [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: d8184114-bc46-46e9-a8b6-2bedf31a51d9] Creating instance directory: /var/lib/nova/instances/d8184114-bc46-46e9-a8b6-2bedf31a51d9 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Dec 05 06:35:29 compute-0 nova_compute[186329]: 2025-12-05 06:35:29.637 186333 DEBUG nova.virt.libvirt.driver [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: d8184114-bc46-46e9-a8b6-2bedf31a51d9] Creating disk.info with the contents: {'/var/lib/nova/instances/d8184114-bc46-46e9-a8b6-2bedf31a51d9/disk': 'qcow2', '/var/lib/nova/instances/d8184114-bc46-46e9-a8b6-2bedf31a51d9/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Dec 05 06:35:29 compute-0 nova_compute[186329]: 2025-12-05 06:35:29.637 186333 DEBUG nova.virt.libvirt.driver [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: d8184114-bc46-46e9-a8b6-2bedf31a51d9] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Dec 05 06:35:29 compute-0 nova_compute[186329]: 2025-12-05 06:35:29.638 186333 DEBUG nova.objects.instance [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lazy-loading 'trusted_certs' on Instance uuid d8184114-bc46-46e9-a8b6-2bedf31a51d9 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:35:29 compute-0 podman[196599]: time="2025-12-05T06:35:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:35:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:35:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:35:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:35:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2590 "" "Go-http-client/1.1"
Dec 05 06:35:29 compute-0 nova_compute[186329]: 2025-12-05 06:35:29.988 186333 INFO nova.compute.manager [-] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Took 1.98 seconds to deallocate network for instance.
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.141 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.144 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.145 186333 DEBUG oslo_concurrency.processutils [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.187 186333 DEBUG oslo_concurrency.processutils [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.188 186333 DEBUG oslo_concurrency.lockutils [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.188 186333 DEBUG oslo_concurrency.lockutils [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.189 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.191 186333 DEBUG oslo_utils.imageutils.format_inspector [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.192 186333 DEBUG oslo_concurrency.processutils [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.233 186333 DEBUG oslo_concurrency.processutils [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.234 186333 DEBUG oslo_concurrency.processutils [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b,backing_fmt=raw /var/lib/nova/instances/d8184114-bc46-46e9-a8b6-2bedf31a51d9/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.256 186333 DEBUG oslo_concurrency.processutils [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b,backing_fmt=raw /var/lib/nova/instances/d8184114-bc46-46e9-a8b6-2bedf31a51d9/disk 1073741824" returned: 0 in 0.022s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.257 186333 DEBUG oslo_concurrency.lockutils [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "e64c6ab6611a8a325f2a8a7889bad0cf22363e3b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.069s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.257 186333 DEBUG oslo_concurrency.processutils [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.299 186333 DEBUG oslo_concurrency.processutils [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e64c6ab6611a8a325f2a8a7889bad0cf22363e3b --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.300 186333 DEBUG nova.virt.disk.api [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Checking if we can resize image /var/lib/nova/instances/d8184114-bc46-46e9-a8b6-2bedf31a51d9/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.300 186333 DEBUG oslo_concurrency.processutils [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d8184114-bc46-46e9-a8b6-2bedf31a51d9/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.342 186333 DEBUG oslo_concurrency.processutils [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d8184114-bc46-46e9-a8b6-2bedf31a51d9/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.343 186333 DEBUG nova.virt.disk.api [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Cannot resize image /var/lib/nova/instances/d8184114-bc46-46e9-a8b6-2bedf31a51d9/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.343 186333 DEBUG nova.objects.instance [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lazy-loading 'migration_context' on Instance uuid d8184114-bc46-46e9-a8b6-2bedf31a51d9 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.503 186333 DEBUG oslo_concurrency.lockutils [None req-9bb19d52-446f-4cbb-b985-a2386ce338dd 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.503 186333 DEBUG oslo_concurrency.lockutils [None req-9bb19d52-446f-4cbb-b985-a2386ce338dd 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.578 186333 DEBUG nova.compute.provider_tree [None req-9bb19d52-446f-4cbb-b985-a2386ce338dd 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.847 186333 DEBUG nova.objects.base [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Object Instance<d8184114-bc46-46e9-a8b6-2bedf31a51d9> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.848 186333 DEBUG oslo_concurrency.processutils [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/d8184114-bc46-46e9-a8b6-2bedf31a51d9/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.863 186333 DEBUG oslo_concurrency.processutils [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/d8184114-bc46-46e9-a8b6-2bedf31a51d9/disk.config 497664" returned: 0 in 0.015s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.864 186333 DEBUG nova.virt.libvirt.driver [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: d8184114-bc46-46e9-a8b6-2bedf31a51d9] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.865 186333 DEBUG nova.virt.libvirt.vif [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-12-05T06:34:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1553891179',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1553891179',id=26,image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-05T06:34:54Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='7aae6fc46c7e46a0a68d5efcb8c24f87',ramdisk_id='',reservation_id='r-y145khow',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='manager,member,admin,reader',image_base_image_ref='6903ca06-7f44-4ad2-ab8b-0d16feef7d51',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-617599826',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-617599826-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-05T06:34:54Z,user_data=None,user_id='0bd6252356344de684e4bce5dcc3c2d3',uuid=d8184114-bc46-46e9-a8b6-2bedf31a51d9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1f310d80-5c1c-45b1-8d19-1bf63071749b", "address": "fa:16:3e:ec:9e:04", "network": {"id": "12281049-d2b1-40ef-9535-ec69961f84f0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1145425560-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f56ecf15c14a5f9dc32756b35ebccc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap1f310d80-5c", "ovs_interfaceid": "1f310d80-5c1c-45b1-8d19-1bf63071749b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.865 186333 DEBUG nova.network.os_vif_util [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Converting VIF {"id": "1f310d80-5c1c-45b1-8d19-1bf63071749b", "address": "fa:16:3e:ec:9e:04", "network": {"id": "12281049-d2b1-40ef-9535-ec69961f84f0", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1145425560-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96f56ecf15c14a5f9dc32756b35ebccc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap1f310d80-5c", "ovs_interfaceid": "1f310d80-5c1c-45b1-8d19-1bf63071749b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.866 186333 DEBUG nova.network.os_vif_util [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ec:9e:04,bridge_name='br-int',has_traffic_filtering=True,id=1f310d80-5c1c-45b1-8d19-1bf63071749b,network=Network(12281049-d2b1-40ef-9535-ec69961f84f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f310d80-5c') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.866 186333 DEBUG os_vif [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ec:9e:04,bridge_name='br-int',has_traffic_filtering=True,id=1f310d80-5c1c-45b1-8d19-1bf63071749b,network=Network(12281049-d2b1-40ef-9535-ec69961f84f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f310d80-5c') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.867 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.867 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.867 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.868 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.868 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'ba2075c3-a457-525a-a551-6b8662049e52', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.869 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.871 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.876 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.878 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.878 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1f310d80-5c, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.878 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap1f310d80-5c, col_values=(('qos', UUID('ecc1d1ba-21bf-46e4-a833-9a3a3137ff07')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.878 186333 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap1f310d80-5c, col_values=(('external_ids', {'iface-id': '1f310d80-5c1c-45b1-8d19-1bf63071749b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ec:9e:04', 'vm-uuid': 'd8184114-bc46-46e9-a8b6-2bedf31a51d9'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.879 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:30 compute-0 NetworkManager[55434]: <info>  [1764916530.8799] manager: (tap1f310d80-5c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/98)
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.881 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.887 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.887 186333 INFO os_vif [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ec:9e:04,bridge_name='br-int',has_traffic_filtering=True,id=1f310d80-5c1c-45b1-8d19-1bf63071749b,network=Network(12281049-d2b1-40ef-9535-ec69961f84f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f310d80-5c')
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.887 186333 DEBUG nova.virt.libvirt.driver [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.888 186333 DEBUG nova.compute.manager [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=70656,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpm1vpb62b',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='d8184114-bc46-46e9-a8b6-2bedf31a51d9',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9381
Dec 05 06:35:30 compute-0 nova_compute[186329]: 2025-12-05 06:35:30.888 186333 WARNING neutronclient.v2_0.client [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:35:31 compute-0 nova_compute[186329]: 2025-12-05 06:35:31.005 186333 WARNING neutronclient.v2_0.client [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:35:31 compute-0 nova_compute[186329]: 2025-12-05 06:35:31.017 186333 DEBUG nova.compute.manager [req-c4da37db-1411-4762-ab90-2072b82af7e3 req-a959a85f-c4f7-4fb9-bba7-80e7753b3055 fa90f49e115d4cfca175c8a08258b852 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: 534583f6-ef6e-4921-a665-be74a1ebf1ee] Received event network-vif-deleted-93d39487-d3d7-4773-9646-52ea7199e328 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11816
Dec 05 06:35:31 compute-0 nova_compute[186329]: 2025-12-05 06:35:31.083 186333 DEBUG nova.scheduler.client.report [None req-9bb19d52-446f-4cbb-b985-a2386ce338dd 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:35:31 compute-0 openstack_network_exporter[198686]: ERROR   06:35:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:35:31 compute-0 openstack_network_exporter[198686]: ERROR   06:35:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:35:31 compute-0 openstack_network_exporter[198686]: ERROR   06:35:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:35:31 compute-0 openstack_network_exporter[198686]: ERROR   06:35:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:35:31 compute-0 openstack_network_exporter[198686]: ERROR   06:35:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:35:31 compute-0 nova_compute[186329]: 2025-12-05 06:35:31.592 186333 DEBUG oslo_concurrency.lockutils [None req-9bb19d52-446f-4cbb-b985-a2386ce338dd 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.089s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:35:31 compute-0 nova_compute[186329]: 2025-12-05 06:35:31.609 186333 INFO nova.scheduler.client.report [None req-9bb19d52-446f-4cbb-b985-a2386ce338dd 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Deleted allocations for instance 534583f6-ef6e-4921-a665-be74a1ebf1ee
Dec 05 06:35:32 compute-0 nova_compute[186329]: 2025-12-05 06:35:32.011 186333 DEBUG nova.network.neutron [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: d8184114-bc46-46e9-a8b6-2bedf31a51d9] Port 1f310d80-5c1c-45b1-8d19-1bf63071749b updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Dec 05 06:35:32 compute-0 nova_compute[186329]: 2025-12-05 06:35:32.018 186333 DEBUG nova.compute.manager [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=70656,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpm1vpb62b',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='d8184114-bc46-46e9-a8b6-2bedf31a51d9',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9447
Dec 05 06:35:32 compute-0 nova_compute[186329]: 2025-12-05 06:35:32.625 186333 DEBUG oslo_concurrency.lockutils [None req-9bb19d52-446f-4cbb-b985-a2386ce338dd 0bd6252356344de684e4bce5dcc3c2d3 7aae6fc46c7e46a0a68d5efcb8c24f87 - - default default] Lock "534583f6-ef6e-4921-a665-be74a1ebf1ee" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.408s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:35:34 compute-0 nova_compute[186329]: 2025-12-05 06:35:34.076 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:34 compute-0 NetworkManager[55434]: <info>  [1764916534.7146] manager: (tap1f310d80-5c): new Tun device (/org/freedesktop/NetworkManager/Devices/99)
Dec 05 06:35:34 compute-0 kernel: tap1f310d80-5c: entered promiscuous mode
Dec 05 06:35:34 compute-0 nova_compute[186329]: 2025-12-05 06:35:34.722 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:34 compute-0 ovn_controller[95223]: 2025-12-05T06:35:34Z|00233|binding|INFO|Claiming lport 1f310d80-5c1c-45b1-8d19-1bf63071749b for this additional chassis.
Dec 05 06:35:34 compute-0 ovn_controller[95223]: 2025-12-05T06:35:34Z|00234|binding|INFO|1f310d80-5c1c-45b1-8d19-1bf63071749b: Claiming fa:16:3e:ec:9e:04 10.100.0.12
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:34.727 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:9e:04 10.100.0.12'], port_security=['fa:16:3e:ec:9e:04 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'd8184114-bc46-46e9-a8b6-2bedf31a51d9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-12281049-d2b1-40ef-9535-ec69961f84f0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7aae6fc46c7e46a0a68d5efcb8c24f87', 'neutron:revision_number': '10', 'neutron:security_group_ids': '79382e11-3430-4326-910e-567b5a1dc769', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=85adff9c-df19-421b-8711-6ce155f4dd7c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=1f310d80-5c1c-45b1-8d19-1bf63071749b) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:34.727 104041 INFO neutron.agent.ovn.metadata.agent [-] Port 1f310d80-5c1c-45b1-8d19-1bf63071749b in datapath 12281049-d2b1-40ef-9535-ec69961f84f0 unbound from our chassis
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:34.728 104041 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 12281049-d2b1-40ef-9535-ec69961f84f0
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:34.737 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[c278ca94-22ce-4800-b41a-b353a3ce335e]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:34.739 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap12281049-d1 in ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Dec 05 06:35:34 compute-0 systemd-udevd[216048]: Network interface NamePolicy= disabled on kernel command line.
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:34.744 206383 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap12281049-d0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:34.744 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[61832253-c771-4efd-addb-e3d6f2f2f704]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:34 compute-0 nova_compute[186329]: 2025-12-05 06:35:34.744 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:34 compute-0 ovn_controller[95223]: 2025-12-05T06:35:34Z|00235|binding|INFO|Setting lport 1f310d80-5c1c-45b1-8d19-1bf63071749b ovn-installed in OVS
Dec 05 06:35:34 compute-0 nova_compute[186329]: 2025-12-05 06:35:34.745 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:34.745 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[49c0080b-c1bc-451c-9fd9-68c8945b0dae]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:34 compute-0 systemd-machined[152967]: New machine qemu-22-instance-0000001a.
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:34.754 104153 DEBUG oslo.privsep.daemon [-] privsep: reply[72512d82-44f5-4f19-97df-ada882355c9a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:34 compute-0 NetworkManager[55434]: <info>  [1764916534.7584] device (tap1f310d80-5c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 05 06:35:34 compute-0 NetworkManager[55434]: <info>  [1764916534.7591] device (tap1f310d80-5c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 05 06:35:34 compute-0 systemd[1]: Started Virtual Machine qemu-22-instance-0000001a.
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:34.768 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[d48398e3-d4e5-4c95-9557-19a4eaa17a51]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:34.791 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[1079d95e-5ef3-421d-9fcb-336825a65d10]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:34.793 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[78173f03-3317-4f3e-8332-f19c64ad4b3d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:34 compute-0 NetworkManager[55434]: <info>  [1764916534.7948] manager: (tap12281049-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/100)
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:34.818 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[e957ee80-3159-489f-8660-9d80c489835a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:34.820 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[72d08d01-869a-488b-a6b5-46dd64c1399f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:34 compute-0 NetworkManager[55434]: <info>  [1764916534.8366] device (tap12281049-d0): carrier: link connected
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:34.841 207374 DEBUG oslo.privsep.daemon [-] privsep: reply[3d948019-e91f-4f2a-bf32-a8b382505a5b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:34.855 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[59d7f989-52f9-4429-8e8e-71979546cbd8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap12281049-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e7:a3:1b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 69], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441337, 'reachable_time': 32226, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216083, 'error': None, 'target': 'ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:34.866 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[574d1441-d769-45d3-94bc-1e7a66a8ecb5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee7:a31b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 441337, 'tstamp': 441337}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216084, 'error': None, 'target': 'ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:34.879 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[273e79ec-c437-4ef6-93c5-4c808b58e73e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap12281049-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 4], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 4], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e7:a3:1b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 69], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441337, 'reachable_time': 32226, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216085, 'error': None, 'target': 'ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:34.900 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[43970d13-8746-4c72-b9fc-9b745902a47a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:34.945 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[f72979a1-ddb8-45f1-b9e1-3d83ca168511]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:34.946 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap12281049-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:34.946 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:34.946 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap12281049-d0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:35:34 compute-0 nova_compute[186329]: 2025-12-05 06:35:34.947 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:34 compute-0 NetworkManager[55434]: <info>  [1764916534.9484] manager: (tap12281049-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/101)
Dec 05 06:35:34 compute-0 kernel: tap12281049-d0: entered promiscuous mode
Dec 05 06:35:34 compute-0 nova_compute[186329]: 2025-12-05 06:35:34.956 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:34.959 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap12281049-d0, col_values=(('external_ids', {'iface-id': 'cc9bed08-ff40-4708-a7bc-12e84332e3cc'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:35:34 compute-0 nova_compute[186329]: 2025-12-05 06:35:34.960 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:34 compute-0 ovn_controller[95223]: 2025-12-05T06:35:34Z|00236|binding|INFO|Releasing lport cc9bed08-ff40-4708-a7bc-12e84332e3cc from this chassis (sb_readonly=0)
Dec 05 06:35:34 compute-0 nova_compute[186329]: 2025-12-05 06:35:34.975 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:34 compute-0 nova_compute[186329]: 2025-12-05 06:35:34.981 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:34.982 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[3da423fb-6c4e-44b3-815f-649b52e4a661]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:34.983 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/12281049-d2b1-40ef-9535-ec69961f84f0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/12281049-d2b1-40ef-9535-ec69961f84f0.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:34.983 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/12281049-d2b1-40ef-9535-ec69961f84f0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/12281049-d2b1-40ef-9535-ec69961f84f0.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:34.983 104041 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 12281049-d2b1-40ef-9535-ec69961f84f0 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:34.983 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/12281049-d2b1-40ef-9535-ec69961f84f0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/12281049-d2b1-40ef-9535-ec69961f84f0.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:34.983 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[f42242a2-accc-463c-9ea9-c712153eddb4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:34.984 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/12281049-d2b1-40ef-9535-ec69961f84f0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/12281049-d2b1-40ef-9535-ec69961f84f0.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:34.984 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[1158d3b2-c48f-460f-a3ec-8ec65599259c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:34.984 104041 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]: global
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]:     log         /dev/log local0 debug
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]:     log-tag     haproxy-metadata-proxy-12281049-d2b1-40ef-9535-ec69961f84f0
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]:     user        root
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]:     group       root
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]:     maxconn     1024
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]:     pidfile     /var/lib/neutron/external/pids/12281049-d2b1-40ef-9535-ec69961f84f0.pid.haproxy
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]:     daemon
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]: defaults
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]:     log global
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]:     mode http
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]:     option httplog
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]:     option dontlognull
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]:     option http-server-close
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]:     option forwardfor
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]:     retries                 3
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]:     timeout http-request    30s
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]:     timeout connect         30s
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]:     timeout client          32s
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]:     timeout server          32s
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]:     timeout http-keep-alive 30s
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]: listen listener
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]:     bind 169.254.169.254:80
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]:     
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]:     server metadata /var/lib/neutron/metadata_proxy
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]: 
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]:     http-request add-header X-OVN-Network-ID 12281049-d2b1-40ef-9535-ec69961f84f0
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Dec 05 06:35:34 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:34.985 104041 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0', 'env', 'PROCESS_TAG=haproxy-12281049-d2b1-40ef-9535-ec69961f84f0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/12281049-d2b1-40ef-9535-ec69961f84f0.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Dec 05 06:35:35 compute-0 podman[216135]: 2025-12-05 06:35:35.291168547 +0000 UTC m=+0.030272907 container create 6c155ba332256680b4c44f78b8c703f5a45992f35cb28e5c897fbd9b22e52624 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS)
Dec 05 06:35:35 compute-0 systemd[1]: Started libpod-conmon-6c155ba332256680b4c44f78b8c703f5a45992f35cb28e5c897fbd9b22e52624.scope.
Dec 05 06:35:35 compute-0 systemd[1]: Started libcrun container.
Dec 05 06:35:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ad757ee314ec577962de0c3aecffd30496b05b2dab248cf7dacf7848e1b3f74/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 05 06:35:35 compute-0 podman[216135]: 2025-12-05 06:35:35.347974418 +0000 UTC m=+0.087078788 container init 6c155ba332256680b4c44f78b8c703f5a45992f35cb28e5c897fbd9b22e52624 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.schema-version=1.0)
Dec 05 06:35:35 compute-0 podman[216135]: 2025-12-05 06:35:35.352293839 +0000 UTC m=+0.091398199 container start 6c155ba332256680b4c44f78b8c703f5a45992f35cb28e5c897fbd9b22e52624 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Dec 05 06:35:35 compute-0 podman[216135]: 2025-12-05 06:35:35.277587843 +0000 UTC m=+0.016692223 image pull 773fbabd60bc1c8e2ad203301a187a593937fa42bcc751aba0971bd96baac0cb quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current
Dec 05 06:35:35 compute-0 neutron-haproxy-ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0[216147]: [NOTICE]   (216151) : New worker (216153) forked
Dec 05 06:35:35 compute-0 neutron-haproxy-ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0[216147]: [NOTICE]   (216151) : Loading success.
Dec 05 06:35:35 compute-0 nova_compute[186329]: 2025-12-05 06:35:35.504 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:35 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:35.505 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '46:84:d1', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'a6:99:88:db:d9:a2'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:35:35 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:35.506 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 05 06:35:35 compute-0 ovn_controller[95223]: 2025-12-05T06:35:35Z|00237|binding|INFO|Claiming lport 1f310d80-5c1c-45b1-8d19-1bf63071749b for this chassis.
Dec 05 06:35:35 compute-0 ovn_controller[95223]: 2025-12-05T06:35:35Z|00238|binding|INFO|1f310d80-5c1c-45b1-8d19-1bf63071749b: Claiming fa:16:3e:ec:9e:04 10.100.0.12
Dec 05 06:35:35 compute-0 ovn_controller[95223]: 2025-12-05T06:35:35Z|00239|binding|INFO|Setting lport 1f310d80-5c1c-45b1-8d19-1bf63071749b up in Southbound
Dec 05 06:35:35 compute-0 kernel: tap1f310d80-5c (unregistering): left promiscuous mode
Dec 05 06:35:35 compute-0 NetworkManager[55434]: <info>  [1764916535.7426] device (tap1f310d80-5c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 05 06:35:35 compute-0 ovn_controller[95223]: 2025-12-05T06:35:35Z|00240|binding|INFO|Releasing lport 1f310d80-5c1c-45b1-8d19-1bf63071749b from this chassis (sb_readonly=0)
Dec 05 06:35:35 compute-0 ovn_controller[95223]: 2025-12-05T06:35:35Z|00241|binding|INFO|Setting lport 1f310d80-5c1c-45b1-8d19-1bf63071749b down in Southbound
Dec 05 06:35:35 compute-0 ovn_controller[95223]: 2025-12-05T06:35:35Z|00242|binding|INFO|Removing iface tap1f310d80-5c ovn-installed in OVS
Dec 05 06:35:35 compute-0 nova_compute[186329]: 2025-12-05 06:35:35.753 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:35 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:35.756 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:9e:04 10.100.0.12'], port_security=['fa:16:3e:ec:9e:04 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'd8184114-bc46-46e9-a8b6-2bedf31a51d9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-12281049-d2b1-40ef-9535-ec69961f84f0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7aae6fc46c7e46a0a68d5efcb8c24f87', 'neutron:revision_number': '15', 'neutron:security_group_ids': '79382e11-3430-4326-910e-567b5a1dc769', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=85adff9c-df19-421b-8711-6ce155f4dd7c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>], logical_port=1f310d80-5c1c-45b1-8d19-1bf63071749b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fc7dd796630>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:35:35 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:35.756 104041 INFO neutron.agent.ovn.metadata.agent [-] Port 1f310d80-5c1c-45b1-8d19-1bf63071749b in datapath 12281049-d2b1-40ef-9535-ec69961f84f0 unbound from our chassis
Dec 05 06:35:35 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:35.757 104041 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 12281049-d2b1-40ef-9535-ec69961f84f0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Dec 05 06:35:35 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:35.757 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[f81c42ae-b220-46f7-a420-71ca49ec1456]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:35 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:35.758 104041 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0 namespace which is not needed anymore
Dec 05 06:35:35 compute-0 nova_compute[186329]: 2025-12-05 06:35:35.773 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:35 compute-0 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000001a.scope: Deactivated successfully.
Dec 05 06:35:35 compute-0 systemd-machined[152967]: Machine qemu-22-instance-0000001a terminated.
Dec 05 06:35:35 compute-0 neutron-haproxy-ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0[216147]: [NOTICE]   (216151) : haproxy version is 3.0.5-8e879a5
Dec 05 06:35:35 compute-0 neutron-haproxy-ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0[216147]: [NOTICE]   (216151) : path to executable is /usr/sbin/haproxy
Dec 05 06:35:35 compute-0 neutron-haproxy-ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0[216147]: [WARNING]  (216151) : Exiting Master process...
Dec 05 06:35:35 compute-0 podman[216183]: 2025-12-05 06:35:35.840370377 +0000 UTC m=+0.022434000 container kill 6c155ba332256680b4c44f78b8c703f5a45992f35cb28e5c897fbd9b22e52624 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 05 06:35:35 compute-0 neutron-haproxy-ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0[216147]: [ALERT]    (216151) : Current worker (216153) exited with code 143 (Terminated)
Dec 05 06:35:35 compute-0 neutron-haproxy-ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0[216147]: [WARNING]  (216151) : All workers exited. Exiting... (0)
Dec 05 06:35:35 compute-0 systemd[1]: libpod-6c155ba332256680b4c44f78b8c703f5a45992f35cb28e5c897fbd9b22e52624.scope: Deactivated successfully.
Dec 05 06:35:35 compute-0 podman[216196]: 2025-12-05 06:35:35.870911247 +0000 UTC m=+0.016787583 container died 6c155ba332256680b4c44f78b8c703f5a45992f35cb28e5c897fbd9b22e52624 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 05 06:35:35 compute-0 nova_compute[186329]: 2025-12-05 06:35:35.880 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:35 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6c155ba332256680b4c44f78b8c703f5a45992f35cb28e5c897fbd9b22e52624-userdata-shm.mount: Deactivated successfully.
Dec 05 06:35:35 compute-0 systemd[1]: var-lib-containers-storage-overlay-6ad757ee314ec577962de0c3aecffd30496b05b2dab248cf7dacf7848e1b3f74-merged.mount: Deactivated successfully.
Dec 05 06:35:35 compute-0 podman[216196]: 2025-12-05 06:35:35.889203389 +0000 UTC m=+0.035079725 container cleanup 6c155ba332256680b4c44f78b8c703f5a45992f35cb28e5c897fbd9b22e52624 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Dec 05 06:35:35 compute-0 systemd[1]: libpod-conmon-6c155ba332256680b4c44f78b8c703f5a45992f35cb28e5c897fbd9b22e52624.scope: Deactivated successfully.
Dec 05 06:35:35 compute-0 podman[216197]: 2025-12-05 06:35:35.901670658 +0000 UTC m=+0.044457706 container remove 6c155ba332256680b4c44f78b8c703f5a45992f35cb28e5c897fbd9b22e52624 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=neutron-haproxy-ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 05 06:35:35 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:35.904 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[59197690-bbb5-4e96-b677-62ed0a3af6bd]: (4, ("Fri Dec  5 06:35:35 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0 (6c155ba332256680b4c44f78b8c703f5a45992f35cb28e5c897fbd9b22e52624)\n6c155ba332256680b4c44f78b8c703f5a45992f35cb28e5c897fbd9b22e52624\nFri Dec  5 06:35:35 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0 (6c155ba332256680b4c44f78b8c703f5a45992f35cb28e5c897fbd9b22e52624)\n6c155ba332256680b4c44f78b8c703f5a45992f35cb28e5c897fbd9b22e52624\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:35 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:35.905 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[b1b7eacc-2b5a-4ce5-b1b4-a96df3852912]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:35 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:35.906 104041 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/12281049-d2b1-40ef-9535-ec69961f84f0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/12281049-d2b1-40ef-9535-ec69961f84f0.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Dec 05 06:35:35 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:35.906 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[5a4f3ea0-f58b-4a88-b001-0dfb67f7fcba]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:35 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:35.907 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap12281049-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:35:35 compute-0 nova_compute[186329]: 2025-12-05 06:35:35.908 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:35 compute-0 kernel: tap12281049-d0: left promiscuous mode
Dec 05 06:35:35 compute-0 nova_compute[186329]: 2025-12-05 06:35:35.930 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:35 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:35.931 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[480537fe-f8b4-4876-838b-f944d4b3f4aa]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:35 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:35.938 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[0b171a1c-f80b-43ff-a9fc-e0c874a6ab8c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:35 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:35.938 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[dc190a78-c37e-4cee-8961-7366371e5abc]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:35 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:35.951 206383 DEBUG oslo.privsep.daemon [-] privsep: reply[9100a9cd-acc9-40f1-bfcc-73f6409f168f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 441332, 'reachable_time': 27227, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216232, 'error': None, 'target': 'ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:35 compute-0 systemd[1]: run-netns-ovnmeta\x2d12281049\x2dd2b1\x2d40ef\x2d9535\x2dec69961f84f0.mount: Deactivated successfully.
Dec 05 06:35:35 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:35.955 104153 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-12281049-d2b1-40ef-9535-ec69961f84f0 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Dec 05 06:35:35 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:35.955 104153 DEBUG oslo.privsep.daemon [-] privsep: reply[b1dc2cad-661b-4f1f-b681-04d3f92b34c7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Dec 05 06:35:36 compute-0 nova_compute[186329]: 2025-12-05 06:35:36.592 186333 INFO nova.compute.manager [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: d8184114-bc46-46e9-a8b6-2bedf31a51d9] Post operation of migration started
Dec 05 06:35:36 compute-0 nova_compute[186329]: 2025-12-05 06:35:36.593 186333 WARNING neutronclient.v2_0.client [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:35:36 compute-0 nova_compute[186329]: 2025-12-05 06:35:36.688 186333 WARNING neutronclient.v2_0.client [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:35:36 compute-0 nova_compute[186329]: 2025-12-05 06:35:36.689 186333 WARNING neutronclient.v2_0.client [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:35:36 compute-0 nova_compute[186329]: 2025-12-05 06:35:36.764 186333 DEBUG oslo_concurrency.lockutils [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "refresh_cache-d8184114-bc46-46e9-a8b6-2bedf31a51d9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Dec 05 06:35:36 compute-0 nova_compute[186329]: 2025-12-05 06:35:36.764 186333 DEBUG oslo_concurrency.lockutils [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquired lock "refresh_cache-d8184114-bc46-46e9-a8b6-2bedf31a51d9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Dec 05 06:35:36 compute-0 nova_compute[186329]: 2025-12-05 06:35:36.764 186333 DEBUG nova.network.neutron [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: d8184114-bc46-46e9-a8b6-2bedf31a51d9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Dec 05 06:35:37 compute-0 nova_compute[186329]: 2025-12-05 06:35:37.269 186333 WARNING neutronclient.v2_0.client [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Dec 05 06:35:37 compute-0 nova_compute[186329]: 2025-12-05 06:35:37.761 186333 INFO nova.network.neutron [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: d8184114-bc46-46e9-a8b6-2bedf31a51d9] Port 1f310d80-5c1c-45b1-8d19-1bf63071749b from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Dec 05 06:35:37 compute-0 nova_compute[186329]: 2025-12-05 06:35:37.762 186333 DEBUG nova.network.neutron [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: d8184114-bc46-46e9-a8b6-2bedf31a51d9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Dec 05 06:35:38 compute-0 nova_compute[186329]: 2025-12-05 06:35:38.266 186333 DEBUG oslo_concurrency.lockutils [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Releasing lock "refresh_cache-d8184114-bc46-46e9-a8b6-2bedf31a51d9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Dec 05 06:35:38 compute-0 nova_compute[186329]: 2025-12-05 06:35:38.778 186333 DEBUG oslo_concurrency.lockutils [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:35:38 compute-0 nova_compute[186329]: 2025-12-05 06:35:38.779 186333 DEBUG oslo_concurrency.lockutils [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:35:38 compute-0 nova_compute[186329]: 2025-12-05 06:35:38.779 186333 DEBUG oslo_concurrency.lockutils [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:35:38 compute-0 nova_compute[186329]: 2025-12-05 06:35:38.781 186333 ERROR nova.compute.manager [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: d8184114-bc46-46e9-a8b6-2bedf31a51d9] Unexpected error during post live migration at destination host.: nova.exception.InstanceNotFound: Instance d8184114-bc46-46e9-a8b6-2bedf31a51d9 could not be found.
Dec 05 06:35:38 compute-0 nova_compute[186329]: 2025-12-05 06:35:38.781 186333 DEBUG nova.compute.manager [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: d8184114-bc46-46e9-a8b6-2bedf31a51d9] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1829
Dec 05 06:35:39 compute-0 nova_compute[186329]: 2025-12-05 06:35:39.077 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:39 compute-0 nova_compute[186329]: 2025-12-05 06:35:39.287 186333 DEBUG nova.objects.instance [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] [instance: d8184114-bc46-46e9-a8b6-2bedf31a51d9] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Dec 05 06:35:40 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:35:40.507 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=89d40815-76f5-4f1d-9077-84d831b7d6c4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:35:40 compute-0 nova_compute[186329]: 2025-12-05 06:35:40.881 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server [None req-65a386ed-c5be-49ca-971e-6d6464712f6d e80eb9b0343d45d5892eedc9dac67ae8 d8fe610270ef4e7f8f4c5bb46d2f9b58 - - default default] Exception during message handling: nova.exception_Remote.UnexpectedTaskStateError_Remote: Conflict updating instance d8184114-bc46-46e9-a8b6-2bedf31a51d9. Expected: {'task_state': ['migrating']}. Actual: {'task_state': None}
Dec 05 06:35:41 compute-0 nova_compute[186329]: Traceback (most recent call last):
Dec 05 06:35:41 compute-0 nova_compute[186329]: 
Dec 05 06:35:41 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 2386, in _instance_update
Dec 05 06:35:41 compute-0 nova_compute[186329]:     update_on_match(compare, 'uuid', updates)
Dec 05 06:35:41 compute-0 nova_compute[186329]: 
Dec 05 06:35:41 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_db/sqlalchemy/orm.py", line 52, in update_on_match
Dec 05 06:35:41 compute-0 nova_compute[186329]:     return update_match.update_on_match(
Dec 05 06:35:41 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:35:41 compute-0 nova_compute[186329]: 
Dec 05 06:35:41 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_db/sqlalchemy/update_match.py", line 194, in update_on_match
Dec 05 06:35:41 compute-0 nova_compute[186329]:     raise NoRowsMatched("Zero rows matched for %d attempts" % attempts)
Dec 05 06:35:41 compute-0 nova_compute[186329]: 
Dec 05 06:35:41 compute-0 nova_compute[186329]: oslo_db.sqlalchemy.update_match.NoRowsMatched: Zero rows matched for 3 attempts
Dec 05 06:35:41 compute-0 nova_compute[186329]: 
Dec 05 06:35:41 compute-0 nova_compute[186329]: 
Dec 05 06:35:41 compute-0 nova_compute[186329]: During handling of the above exception, another exception occurred:
Dec 05 06:35:41 compute-0 nova_compute[186329]: 
Dec 05 06:35:41 compute-0 nova_compute[186329]: 
Dec 05 06:35:41 compute-0 nova_compute[186329]: Traceback (most recent call last):
Dec 05 06:35:41 compute-0 nova_compute[186329]: 
Dec 05 06:35:41 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/conductor/manager.py", line 143, in _object_dispatch
Dec 05 06:35:41 compute-0 nova_compute[186329]:     return getattr(target, method)(*args, **kwargs)
Dec 05 06:35:41 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:35:41 compute-0 nova_compute[186329]: 
Dec 05 06:35:41 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_versionedobjects/base.py", line 226, in wrapper
Dec 05 06:35:41 compute-0 nova_compute[186329]:     return fn(self, *args, **kwargs)
Dec 05 06:35:41 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:35:41 compute-0 nova_compute[186329]: 
Dec 05 06:35:41 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/objects/instance.py", line 878, in save
Dec 05 06:35:41 compute-0 nova_compute[186329]:     old_ref, inst_ref = db.instance_update_and_get_original(
Dec 05 06:35:41 compute-0 nova_compute[186329]:                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:35:41 compute-0 nova_compute[186329]: 
Dec 05 06:35:41 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/db/utils.py", line 35, in wrapper
Dec 05 06:35:41 compute-0 nova_compute[186329]:     return f(*args, **kwargs)
Dec 05 06:35:41 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^
Dec 05 06:35:41 compute-0 nova_compute[186329]: 
Dec 05 06:35:41 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 144, in wrapper
Dec 05 06:35:41 compute-0 nova_compute[186329]:     with excutils.save_and_reraise_exception() as ectxt:
Dec 05 06:35:41 compute-0 nova_compute[186329]:          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:35:41 compute-0 nova_compute[186329]: 
Dec 05 06:35:41 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 05 06:35:41 compute-0 nova_compute[186329]:     self.force_reraise()
Dec 05 06:35:41 compute-0 nova_compute[186329]: 
Dec 05 06:35:41 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 05 06:35:41 compute-0 nova_compute[186329]:     raise self.value
Dec 05 06:35:41 compute-0 nova_compute[186329]: 
Dec 05 06:35:41 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 142, in wrapper
Dec 05 06:35:41 compute-0 nova_compute[186329]:     return f(*args, **kwargs)
Dec 05 06:35:41 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^
Dec 05 06:35:41 compute-0 nova_compute[186329]: 
Dec 05 06:35:41 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 207, in wrapper
Dec 05 06:35:41 compute-0 nova_compute[186329]:     return f(context, *args, **kwargs)
Dec 05 06:35:41 compute-0 nova_compute[186329]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:35:41 compute-0 nova_compute[186329]: 
Dec 05 06:35:41 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 2303, in instance_update_and_get_original
Dec 05 06:35:41 compute-0 nova_compute[186329]:     return (copy.copy(instance_ref), _instance_update(
Dec 05 06:35:41 compute-0 nova_compute[186329]:                                      ^^^^^^^^^^^^^^^^^
Dec 05 06:35:41 compute-0 nova_compute[186329]: 
Dec 05 06:35:41 compute-0 nova_compute[186329]:   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 2445, in _instance_update
Dec 05 06:35:41 compute-0 nova_compute[186329]:     raise exc(**exc_props)
Dec 05 06:35:41 compute-0 nova_compute[186329]: 
Dec 05 06:35:41 compute-0 nova_compute[186329]: nova.exception.UnexpectedTaskStateError: Conflict updating instance d8184114-bc46-46e9-a8b6-2bedf31a51d9. Expected: {'task_state': ['migrating']}. Actual: {'task_state': None}
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py", line 682, in _get_domain
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     return conn.lookupByUUIDString(instance.uuid)
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/eventlet/tpool.py", line 186, in doit
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     result = proxy_call(self._autowrap, f, *args, **kwargs)
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/eventlet/tpool.py", line 144, in proxy_call
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     rv = execute(f, *args, **kwargs)
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server          ^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/eventlet/tpool.py", line 125, in execute
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     raise e.with_traceback(tb)
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/eventlet/tpool.py", line 82, in tworker
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     rv = meth(*args, **kwargs)
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server          ^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib64/python3.12/site-packages/libvirt.py", line 5121, in lookupByUUIDString
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     raise libvirtError('virDomainLookupByUUIDString() failed')
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server libvirt.libvirtError: Domain not found: no domain with matching uuid 'd8184114-bc46-46e9-a8b6-2bedf31a51d9'
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred:
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 10205, in post_live_migration_at_destination
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception():
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     self.force_reraise()
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     raise self.value
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 10201, in post_live_migration_at_destination
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     self.driver.post_live_migration_at_destination(
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py", line 12076, in post_live_migration_at_destination
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     self._reattach_instance_vifs(context, instance, network_info)
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py", line 11621, in _reattach_instance_vifs
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     guest = self._host.get_guest(instance)
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py", line 666, in get_guest
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     return libvirt_guest.Guest(self._get_domain(instance))
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server                                ^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py", line 686, in _get_domain
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     raise exception.InstanceNotFound(instance_id=instance.uuid)
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server nova.exception.InstanceNotFound: Instance d8184114-bc46-46e9-a8b6-2bedf31a51d9 could not be found.
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred:
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/server.py", line 174, in _process_incoming
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server              ^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/exception_wrapper.py", line 65, in wrapped
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception():
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     self.force_reraise()
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     raise self.value
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/exception_wrapper.py", line 63, in wrapped
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     return f(self, context, *args, **kw)
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/compute/utils.py", line 1483, in decorated_function
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 213, in decorated_function
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception():
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     self.force_reraise()
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     raise self.value
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 203, in decorated_function
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/compute/manager.py", line 10237, in post_live_migration_at_destination
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     instance.save(expected_task_state=task_states.MIGRATING)
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_versionedobjects/base.py", line 209, in wrapper
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     updates, result = self.indirection_api.object_action(
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/conductor/rpcapi.py", line 247, in object_action
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     return cctxt.call(context, 'object_action', objinst=objinst,
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_messaging/rpc/client.py", line 180, in call
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     result = self.transport._send(
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server              ^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_messaging/transport.py", line 123, in _send
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     return self._driver.send(target, ctxt, message,
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 794, in send
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     return self._send(target, ctxt, message, wait_for_reply, timeout,
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 786, in _send
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     raise result
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server nova.exception_Remote.UnexpectedTaskStateError_Remote: Conflict updating instance d8184114-bc46-46e9-a8b6-2bedf31a51d9. Expected: {'task_state': ['migrating']}. Actual: {'task_state': None}
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 2386, in _instance_update
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     update_on_match(compare, 'uuid', updates)
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_db/sqlalchemy/orm.py", line 52, in update_on_match
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     return update_match.update_on_match(
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_db/sqlalchemy/update_match.py", line 194, in update_on_match
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     raise NoRowsMatched("Zero rows matched for %d attempts" % attempts)
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server oslo_db.sqlalchemy.update_match.NoRowsMatched: Zero rows matched for 3 attempts
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred:
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/conductor/manager.py", line 143, in _object_dispatch
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     return getattr(target, method)(*args, **kwargs)
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_versionedobjects/base.py", line 226, in wrapper
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     return fn(self, *args, **kwargs)
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/objects/instance.py", line 878, in save
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     old_ref, inst_ref = db.instance_update_and_get_original(
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/db/utils.py", line 35, in wrapper
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     return f(*args, **kwargs)
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 144, in wrapper
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception() as ectxt:
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     self.force_reraise()
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     raise self.value
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/oslo_db/api.py", line 142, in wrapper
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     return f(*args, **kwargs)
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 207, in wrapper
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     return f(context, *args, **kwargs)
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server            ^^^^^^^^^^^^^^^^^^^^^^^^^^^
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 2303, in instance_update_and_get_original
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     return (copy.copy(instance_ref), _instance_update(
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server                                      ^^^^^^^^^^^^^^^^^
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.12/site-packages/nova/db/main/api.py", line 2445, in _instance_update
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server     raise exc(**exc_props)
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server nova.exception.UnexpectedTaskStateError: Conflict updating instance d8184114-bc46-46e9-a8b6-2bedf31a51d9. Expected: {'task_state': ['migrating']}. Actual: {'task_state': None}
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:35:41 compute-0 nova_compute[186329]: 2025-12-05 06:35:41.324 186333 ERROR oslo_messaging.rpc.server 
Dec 05 06:35:44 compute-0 nova_compute[186329]: 2025-12-05 06:35:44.079 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:45 compute-0 nova_compute[186329]: 2025-12-05 06:35:45.884 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:47 compute-0 podman[216250]: 2025-12-05 06:35:47.453176509 +0000 UTC m=+0.038157812 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 05 06:35:47 compute-0 podman[216249]: 2025-12-05 06:35:47.475408668 +0000 UTC m=+0.060751701 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true)
Dec 05 06:35:49 compute-0 nova_compute[186329]: 2025-12-05 06:35:49.081 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:50 compute-0 nova_compute[186329]: 2025-12-05 06:35:50.886 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:54 compute-0 nova_compute[186329]: 2025-12-05 06:35:54.083 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:55 compute-0 nova_compute[186329]: 2025-12-05 06:35:55.887 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:56 compute-0 podman[216294]: 2025-12-05 06:35:56.464642803 +0000 UTC m=+0.047166019 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, io.buildah.version=1.33.7, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.tags=minimal rhel9, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc.)
Dec 05 06:35:56 compute-0 podman[216293]: 2025-12-05 06:35:56.465032736 +0000 UTC m=+0.049132605 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 05 06:35:56 compute-0 podman[216295]: 2025-12-05 06:35:56.496393689 +0000 UTC m=+0.076747054 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 05 06:35:59 compute-0 nova_compute[186329]: 2025-12-05 06:35:59.083 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:35:59 compute-0 podman[196599]: time="2025-12-05T06:35:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:35:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:35:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:35:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:35:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2590 "" "Go-http-client/1.1"
Dec 05 06:36:00 compute-0 nova_compute[186329]: 2025-12-05 06:36:00.890 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:36:01 compute-0 openstack_network_exporter[198686]: ERROR   06:36:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:36:01 compute-0 openstack_network_exporter[198686]: ERROR   06:36:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:36:01 compute-0 openstack_network_exporter[198686]: ERROR   06:36:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:36:01 compute-0 openstack_network_exporter[198686]: ERROR   06:36:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:36:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:36:01 compute-0 openstack_network_exporter[198686]: ERROR   06:36:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:36:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:36:04 compute-0 nova_compute[186329]: 2025-12-05 06:36:04.084 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:36:04 compute-0 nova_compute[186329]: 2025-12-05 06:36:04.213 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:36:04 compute-0 nova_compute[186329]: 2025-12-05 06:36:04.213 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:36:04 compute-0 nova_compute[186329]: 2025-12-05 06:36:04.214 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 05 06:36:04 compute-0 nova_compute[186329]: 2025-12-05 06:36:04.705 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:36:05 compute-0 nova_compute[186329]: 2025-12-05 06:36:05.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:36:05 compute-0 nova_compute[186329]: 2025-12-05 06:36:05.893 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:36:06 compute-0 nova_compute[186329]: 2025-12-05 06:36:06.225 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:36:06 compute-0 nova_compute[186329]: 2025-12-05 06:36:06.226 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:36:06 compute-0 nova_compute[186329]: 2025-12-05 06:36:06.226 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:36:06 compute-0 nova_compute[186329]: 2025-12-05 06:36:06.226 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 05 06:36:06 compute-0 nova_compute[186329]: 2025-12-05 06:36:06.403 186333 WARNING nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:36:06 compute-0 nova_compute[186329]: 2025-12-05 06:36:06.404 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:36:06 compute-0 nova_compute[186329]: 2025-12-05 06:36:06.418 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.015s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:36:06 compute-0 nova_compute[186329]: 2025-12-05 06:36:06.419 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5812MB free_disk=73.16161727905273GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": 
"0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 05 06:36:06 compute-0 nova_compute[186329]: 2025-12-05 06:36:06.420 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:36:06 compute-0 nova_compute[186329]: 2025-12-05 06:36:06.420 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:36:06 compute-0 ovn_controller[95223]: 2025-12-05T06:36:06Z|00243|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Dec 05 06:36:08 compute-0 nova_compute[186329]: 2025-12-05 06:36:08.040 186333 INFO nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance 83dcc9e9-9fcf-4e34-8b76-598c38ed9bc0 has allocations against this compute host but is not found in the database.
Dec 05 06:36:08 compute-0 nova_compute[186329]: 2025-12-05 06:36:08.041 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 05 06:36:08 compute-0 nova_compute[186329]: 2025-12-05 06:36:08.041 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 06:36:06 up  1:14,  0 user,  load average: 0.09, 0.08, 0.16\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 05 06:36:08 compute-0 nova_compute[186329]: 2025-12-05 06:36:08.157 186333 DEBUG nova.compute.provider_tree [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:36:08 compute-0 nova_compute[186329]: 2025-12-05 06:36:08.663 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:36:09 compute-0 nova_compute[186329]: 2025-12-05 06:36:09.085 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:36:09 compute-0 nova_compute[186329]: 2025-12-05 06:36:09.168 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 05 06:36:09 compute-0 nova_compute[186329]: 2025-12-05 06:36:09.169 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.749s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:36:10 compute-0 nova_compute[186329]: 2025-12-05 06:36:10.169 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:36:10 compute-0 nova_compute[186329]: 2025-12-05 06:36:10.170 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:36:10 compute-0 nova_compute[186329]: 2025-12-05 06:36:10.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:36:10 compute-0 nova_compute[186329]: 2025-12-05 06:36:10.895 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:36:14 compute-0 nova_compute[186329]: 2025-12-05 06:36:14.087 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:36:14 compute-0 nova_compute[186329]: 2025-12-05 06:36:14.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:36:15 compute-0 nova_compute[186329]: 2025-12-05 06:36:15.897 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:36:18 compute-0 podman[216347]: 2025-12-05 06:36:18.453361705 +0000 UTC m=+0.038390816 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 05 06:36:18 compute-0 podman[216346]: 2025-12-05 06:36:18.464999094 +0000 UTC m=+0.051563041 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_id=ovn_controller)
Dec 05 06:36:19 compute-0 nova_compute[186329]: 2025-12-05 06:36:19.088 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:36:20 compute-0 nova_compute[186329]: 2025-12-05 06:36:20.899 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:36:24 compute-0 nova_compute[186329]: 2025-12-05 06:36:24.089 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:36:25 compute-0 nova_compute[186329]: 2025-12-05 06:36:25.902 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:36:27 compute-0 podman[216392]: 2025-12-05 06:36:27.458358612 +0000 UTC m=+0.044014068 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, version=9.6, config_id=edpm, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9)
Dec 05 06:36:27 compute-0 podman[216391]: 2025-12-05 06:36:27.473309039 +0000 UTC m=+0.059926365 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_id=ovn_metadata_agent, io.buildah.version=1.41.4)
Dec 05 06:36:27 compute-0 podman[216393]: 2025-12-05 06:36:27.488331231 +0000 UTC m=+0.072282917 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true)
Dec 05 06:36:29 compute-0 nova_compute[186329]: 2025-12-05 06:36:29.090 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:36:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:36:29.524 104041 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:36:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:36:29.524 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:36:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:36:29.524 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:36:29 compute-0 podman[196599]: time="2025-12-05T06:36:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:36:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:36:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:36:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:36:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2588 "" "Go-http-client/1.1"
Dec 05 06:36:30 compute-0 nova_compute[186329]: 2025-12-05 06:36:30.904 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:36:31 compute-0 openstack_network_exporter[198686]: ERROR   06:36:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:36:31 compute-0 openstack_network_exporter[198686]: ERROR   06:36:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:36:31 compute-0 openstack_network_exporter[198686]: ERROR   06:36:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:36:31 compute-0 openstack_network_exporter[198686]: ERROR   06:36:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:36:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:36:31 compute-0 openstack_network_exporter[198686]: ERROR   06:36:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:36:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:36:34 compute-0 nova_compute[186329]: 2025-12-05 06:36:34.091 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:36:35 compute-0 nova_compute[186329]: 2025-12-05 06:36:35.906 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:36:39 compute-0 nova_compute[186329]: 2025-12-05 06:36:39.092 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:36:40 compute-0 nova_compute[186329]: 2025-12-05 06:36:40.908 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:36:44 compute-0 nova_compute[186329]: 2025-12-05 06:36:44.093 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:36:45 compute-0 nova_compute[186329]: 2025-12-05 06:36:45.911 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:36:49 compute-0 nova_compute[186329]: 2025-12-05 06:36:49.095 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:36:49 compute-0 podman[216445]: 2025-12-05 06:36:49.459459841 +0000 UTC m=+0.041845502 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 06:36:49 compute-0 podman[216444]: 2025-12-05 06:36:49.47265017 +0000 UTC m=+0.056625058 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 05 06:36:50 compute-0 nova_compute[186329]: 2025-12-05 06:36:50.913 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:36:54 compute-0 nova_compute[186329]: 2025-12-05 06:36:54.096 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:36:55 compute-0 nova_compute[186329]: 2025-12-05 06:36:55.914 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:36:58 compute-0 podman[216489]: 2025-12-05 06:36:58.46048927 +0000 UTC m=+0.046108737 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, 
container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 05 06:36:58 compute-0 podman[216490]: 2025-12-05 06:36:58.4629731 +0000 UTC m=+0.044475859 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=edpm, release=1755695350, container_name=openstack_network_exporter)
Dec 05 06:36:58 compute-0 podman[216491]: 2025-12-05 06:36:58.463057629 +0000 UTC m=+0.043863255 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 05 06:36:59 compute-0 nova_compute[186329]: 2025-12-05 06:36:59.097 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:36:59 compute-0 podman[196599]: time="2025-12-05T06:36:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:36:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:36:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:36:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:36:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2590 "" "Go-http-client/1.1"
Dec 05 06:37:00 compute-0 nova_compute[186329]: 2025-12-05 06:37:00.917 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:37:01 compute-0 openstack_network_exporter[198686]: ERROR   06:37:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:37:01 compute-0 openstack_network_exporter[198686]: ERROR   06:37:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:37:01 compute-0 openstack_network_exporter[198686]: ERROR   06:37:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:37:01 compute-0 openstack_network_exporter[198686]: ERROR   06:37:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:37:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:37:01 compute-0 openstack_network_exporter[198686]: ERROR   06:37:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:37:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:37:04 compute-0 nova_compute[186329]: 2025-12-05 06:37:04.099 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:37:05 compute-0 nova_compute[186329]: 2025-12-05 06:37:05.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:37:05 compute-0 nova_compute[186329]: 2025-12-05 06:37:05.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:37:05 compute-0 nova_compute[186329]: 2025-12-05 06:37:05.710 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 05 06:37:05 compute-0 nova_compute[186329]: 2025-12-05 06:37:05.919 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:37:06 compute-0 nova_compute[186329]: 2025-12-05 06:37:06.705 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:37:06 compute-0 nova_compute[186329]: 2025-12-05 06:37:06.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:37:06 compute-0 nova_compute[186329]: 2025-12-05 06:37:06.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:37:07 compute-0 nova_compute[186329]: 2025-12-05 06:37:07.223 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:37:07 compute-0 nova_compute[186329]: 2025-12-05 06:37:07.223 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:37:07 compute-0 nova_compute[186329]: 2025-12-05 06:37:07.224 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:37:07 compute-0 nova_compute[186329]: 2025-12-05 06:37:07.224 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 05 06:37:07 compute-0 nova_compute[186329]: 2025-12-05 06:37:07.389 186333 WARNING nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:37:07 compute-0 nova_compute[186329]: 2025-12-05 06:37:07.390 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:37:07 compute-0 nova_compute[186329]: 2025-12-05 06:37:07.403 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.013s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:37:07 compute-0 nova_compute[186329]: 2025-12-05 06:37:07.403 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5832MB free_disk=73.16160202026367GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": 
"0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 05 06:37:07 compute-0 nova_compute[186329]: 2025-12-05 06:37:07.404 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:37:07 compute-0 nova_compute[186329]: 2025-12-05 06:37:07.404 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:37:08 compute-0 nova_compute[186329]: 2025-12-05 06:37:08.941 186333 INFO nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance 83dcc9e9-9fcf-4e34-8b76-598c38ed9bc0 has allocations against this compute host but is not found in the database.
Dec 05 06:37:08 compute-0 nova_compute[186329]: 2025-12-05 06:37:08.942 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 05 06:37:08 compute-0 nova_compute[186329]: 2025-12-05 06:37:08.942 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 06:37:07 up  1:15,  0 user,  load average: 0.11, 0.08, 0.15\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 05 06:37:08 compute-0 nova_compute[186329]: 2025-12-05 06:37:08.986 186333 DEBUG nova.compute.provider_tree [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:37:09 compute-0 nova_compute[186329]: 2025-12-05 06:37:09.100 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:37:09 compute-0 nova_compute[186329]: 2025-12-05 06:37:09.490 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:37:09 compute-0 nova_compute[186329]: 2025-12-05 06:37:09.996 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 05 06:37:09 compute-0 nova_compute[186329]: 2025-12-05 06:37:09.996 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.592s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:37:10 compute-0 nova_compute[186329]: 2025-12-05 06:37:10.922 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:37:14 compute-0 nova_compute[186329]: 2025-12-05 06:37:14.101 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:37:14 compute-0 nova_compute[186329]: 2025-12-05 06:37:14.996 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:37:15 compute-0 nova_compute[186329]: 2025-12-05 06:37:15.503 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:37:15 compute-0 nova_compute[186329]: 2025-12-05 06:37:15.504 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:37:15 compute-0 nova_compute[186329]: 2025-12-05 06:37:15.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:37:15 compute-0 nova_compute[186329]: 2025-12-05 06:37:15.924 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:37:19 compute-0 nova_compute[186329]: 2025-12-05 06:37:19.103 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:37:20 compute-0 podman[216542]: 2025-12-05 06:37:20.456427583 +0000 UTC m=+0.035699307 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 05 06:37:20 compute-0 podman[216541]: 2025-12-05 06:37:20.473042901 +0000 UTC m=+0.054037144 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 05 06:37:20 compute-0 nova_compute[186329]: 2025-12-05 06:37:20.925 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:37:24 compute-0 nova_compute[186329]: 2025-12-05 06:37:24.103 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:37:25 compute-0 nova_compute[186329]: 2025-12-05 06:37:25.927 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:37:29 compute-0 nova_compute[186329]: 2025-12-05 06:37:29.104 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:37:29 compute-0 podman[216587]: 2025-12-05 06:37:29.458500665 +0000 UTC m=+0.038391429 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd)
Dec 05 06:37:29 compute-0 podman[216586]: 2025-12-05 06:37:29.463420885 +0000 UTC m=+0.045319513 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat 
Universal Base Image 9., io.buildah.version=1.33.7, name=ubi9-minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, version=9.6, container_name=openstack_network_exporter, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container)
Dec 05 06:37:29 compute-0 podman[216585]: 2025-12-05 06:37:29.485350695 +0000 UTC m=+0.068810889 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 05 06:37:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:37:29.525 104041 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:37:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:37:29.525 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:37:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:37:29.525 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:37:29 compute-0 podman[196599]: time="2025-12-05T06:37:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:37:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:37:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:37:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:37:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2589 "" "Go-http-client/1.1"
Dec 05 06:37:30 compute-0 nova_compute[186329]: 2025-12-05 06:37:30.929 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:37:31 compute-0 openstack_network_exporter[198686]: ERROR   06:37:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:37:31 compute-0 openstack_network_exporter[198686]: ERROR   06:37:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:37:31 compute-0 openstack_network_exporter[198686]: ERROR   06:37:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:37:31 compute-0 openstack_network_exporter[198686]: ERROR   06:37:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:37:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:37:31 compute-0 openstack_network_exporter[198686]: ERROR   06:37:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:37:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:37:34 compute-0 nova_compute[186329]: 2025-12-05 06:37:34.106 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:37:35 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec 05 06:37:35 compute-0 nova_compute[186329]: 2025-12-05 06:37:35.931 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:37:39 compute-0 nova_compute[186329]: 2025-12-05 06:37:39.107 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:37:40 compute-0 nova_compute[186329]: 2025-12-05 06:37:40.933 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:37:44 compute-0 nova_compute[186329]: 2025-12-05 06:37:44.108 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:37:45 compute-0 nova_compute[186329]: 2025-12-05 06:37:45.936 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:37:49 compute-0 nova_compute[186329]: 2025-12-05 06:37:49.109 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:37:50 compute-0 nova_compute[186329]: 2025-12-05 06:37:50.938 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:37:51 compute-0 podman[216640]: 2025-12-05 06:37:51.450707232 +0000 UTC m=+0.035793564 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 06:37:51 compute-0 podman[216639]: 2025-12-05 06:37:51.470399434 +0000 UTC m=+0.056945711 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec 05 06:37:54 compute-0 nova_compute[186329]: 2025-12-05 06:37:54.110 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:37:55 compute-0 nova_compute[186329]: 2025-12-05 06:37:55.940 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:37:59 compute-0 nova_compute[186329]: 2025-12-05 06:37:59.110 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:37:59 compute-0 podman[196599]: time="2025-12-05T06:37:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:37:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:37:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:37:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:37:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2587 "" "Go-http-client/1.1"
Dec 05 06:38:00 compute-0 podman[216683]: 2025-12-05 06:38:00.459363951 +0000 UTC m=+0.042758308 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, config_id=edpm, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal)
Dec 05 06:38:00 compute-0 podman[216684]: 2025-12-05 06:38:00.465498764 +0000 UTC m=+0.047106232 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 05 06:38:00 compute-0 podman[216682]: 2025-12-05 06:38:00.484361086 +0000 UTC m=+0.069135288 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec)
Dec 05 06:38:00 compute-0 nova_compute[186329]: 2025-12-05 06:38:00.942 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:38:01 compute-0 openstack_network_exporter[198686]: ERROR   06:38:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:38:01 compute-0 openstack_network_exporter[198686]: ERROR   06:38:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:38:01 compute-0 openstack_network_exporter[198686]: ERROR   06:38:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:38:01 compute-0 openstack_network_exporter[198686]: ERROR   06:38:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:38:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:38:01 compute-0 openstack_network_exporter[198686]: ERROR   06:38:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:38:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:38:04 compute-0 nova_compute[186329]: 2025-12-05 06:38:04.111 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:38:05 compute-0 nova_compute[186329]: 2025-12-05 06:38:05.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:38:05 compute-0 nova_compute[186329]: 2025-12-05 06:38:05.944 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:38:06 compute-0 nova_compute[186329]: 2025-12-05 06:38:06.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:38:07 compute-0 nova_compute[186329]: 2025-12-05 06:38:07.224 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:38:07 compute-0 nova_compute[186329]: 2025-12-05 06:38:07.224 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:38:07 compute-0 nova_compute[186329]: 2025-12-05 06:38:07.224 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:38:07 compute-0 nova_compute[186329]: 2025-12-05 06:38:07.225 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 05 06:38:07 compute-0 nova_compute[186329]: 2025-12-05 06:38:07.382 186333 WARNING nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:38:07 compute-0 nova_compute[186329]: 2025-12-05 06:38:07.383 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:38:07 compute-0 nova_compute[186329]: 2025-12-05 06:38:07.396 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.013s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:38:07 compute-0 nova_compute[186329]: 2025-12-05 06:38:07.396 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5847MB free_disk=73.16239547729492GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": 
"0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 05 06:38:07 compute-0 nova_compute[186329]: 2025-12-05 06:38:07.397 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:38:07 compute-0 nova_compute[186329]: 2025-12-05 06:38:07.397 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:38:08 compute-0 nova_compute[186329]: 2025-12-05 06:38:08.948 186333 INFO nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance 83dcc9e9-9fcf-4e34-8b76-598c38ed9bc0 has allocations against this compute host but is not found in the database.
Dec 05 06:38:08 compute-0 nova_compute[186329]: 2025-12-05 06:38:08.948 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 05 06:38:08 compute-0 nova_compute[186329]: 2025-12-05 06:38:08.949 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 06:38:07 up  1:16,  0 user,  load average: 0.28, 0.11, 0.15\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 05 06:38:08 compute-0 nova_compute[186329]: 2025-12-05 06:38:08.966 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Refreshing inventories for resource provider f2df025e-56e9-4920-9fad-1a12202c4aeb _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Dec 05 06:38:08 compute-0 nova_compute[186329]: 2025-12-05 06:38:08.986 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Updating ProviderTree inventory for provider f2df025e-56e9-4920-9fad-1a12202c4aeb from _refresh_and_get_inventory using data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Dec 05 06:38:08 compute-0 nova_compute[186329]: 2025-12-05 06:38:08.987 186333 DEBUG nova.compute.provider_tree [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Updating inventory in ProviderTree for provider f2df025e-56e9-4920-9fad-1a12202c4aeb with inventory: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Dec 05 06:38:08 compute-0 nova_compute[186329]: 2025-12-05 06:38:08.999 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Refreshing aggregate associations for resource provider f2df025e-56e9-4920-9fad-1a12202c4aeb, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Dec 05 06:38:09 compute-0 nova_compute[186329]: 2025-12-05 06:38:09.017 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Refreshing trait associations for resource provider f2df025e-56e9-4920-9fad-1a12202c4aeb, traits: COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_EXTEND,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SOUND_MODEL_SB16,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SOUND_MODEL_AC97,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE42,COMPUTE_NET_VIRTIO_PACKED,HW_CPU_X86_SSE2,COMPUTE_STORAGE_VIRTIO_FS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_TPM_CRB,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SECURITY_TPM_TIS,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SOUND_MODEL_ES1370,HW_ARCH_X86_64,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SOUND_MODEL_USB,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_TRUSTED_CERTS,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_ARCH_X86_64,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_INTEL _refresh_associations 
/usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Dec 05 06:38:09 compute-0 nova_compute[186329]: 2025-12-05 06:38:09.049 186333 DEBUG nova.compute.provider_tree [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:38:09 compute-0 nova_compute[186329]: 2025-12-05 06:38:09.113 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:38:09 compute-0 nova_compute[186329]: 2025-12-05 06:38:09.554 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:38:10 compute-0 nova_compute[186329]: 2025-12-05 06:38:10.062 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 05 06:38:10 compute-0 nova_compute[186329]: 2025-12-05 06:38:10.062 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.665s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:38:10 compute-0 nova_compute[186329]: 2025-12-05 06:38:10.946 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:38:11 compute-0 nova_compute[186329]: 2025-12-05 06:38:11.062 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:38:11 compute-0 nova_compute[186329]: 2025-12-05 06:38:11.062 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:38:11 compute-0 nova_compute[186329]: 2025-12-05 06:38:11.063 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:38:11 compute-0 nova_compute[186329]: 2025-12-05 06:38:11.063 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 05 06:38:11 compute-0 nova_compute[186329]: 2025-12-05 06:38:11.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:38:13 compute-0 nova_compute[186329]: 2025-12-05 06:38:13.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:38:14 compute-0 nova_compute[186329]: 2025-12-05 06:38:14.115 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:38:15 compute-0 nova_compute[186329]: 2025-12-05 06:38:15.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:38:15 compute-0 nova_compute[186329]: 2025-12-05 06:38:15.949 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:38:19 compute-0 nova_compute[186329]: 2025-12-05 06:38:19.116 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:38:20 compute-0 nova_compute[186329]: 2025-12-05 06:38:20.951 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:38:22 compute-0 podman[216736]: 2025-12-05 06:38:22.456474797 +0000 UTC m=+0.040119698 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 06:38:22 compute-0 podman[216735]: 2025-12-05 06:38:22.471041153 +0000 UTC m=+0.056422849 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2)
Dec 05 06:38:24 compute-0 nova_compute[186329]: 2025-12-05 06:38:24.118 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:38:25 compute-0 nova_compute[186329]: 2025-12-05 06:38:25.953 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:38:29 compute-0 nova_compute[186329]: 2025-12-05 06:38:29.119 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:38:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:38:29.526 104041 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:38:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:38:29.526 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:38:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:38:29.526 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:38:29 compute-0 podman[196599]: time="2025-12-05T06:38:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:38:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:38:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:38:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:38:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2593 "" "Go-http-client/1.1"
Dec 05 06:38:30 compute-0 nova_compute[186329]: 2025-12-05 06:38:30.955 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:38:31 compute-0 openstack_network_exporter[198686]: ERROR   06:38:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:38:31 compute-0 openstack_network_exporter[198686]: ERROR   06:38:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:38:31 compute-0 openstack_network_exporter[198686]: ERROR   06:38:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:38:31 compute-0 openstack_network_exporter[198686]: ERROR   06:38:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:38:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:38:31 compute-0 openstack_network_exporter[198686]: ERROR   06:38:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:38:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:38:31 compute-0 podman[216781]: 2025-12-05 06:38:31.469396003 +0000 UTC m=+0.049760172 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent)
Dec 05 06:38:31 compute-0 podman[216783]: 2025-12-05 06:38:31.471751572 +0000 UTC m=+0.049654364 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2)
Dec 05 06:38:31 compute-0 podman[216782]: 2025-12-05 06:38:31.481346701 +0000 UTC m=+0.058801620 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., release=1755695350, io.buildah.version=1.33.7, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, config_id=edpm, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, version=9.6)
Dec 05 06:38:34 compute-0 nova_compute[186329]: 2025-12-05 06:38:34.121 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:38:35 compute-0 nova_compute[186329]: 2025-12-05 06:38:35.957 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:38:39 compute-0 nova_compute[186329]: 2025-12-05 06:38:39.122 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:38:40 compute-0 nova_compute[186329]: 2025-12-05 06:38:40.959 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:38:44 compute-0 nova_compute[186329]: 2025-12-05 06:38:44.125 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:38:45 compute-0 nova_compute[186329]: 2025-12-05 06:38:45.960 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:38:49 compute-0 nova_compute[186329]: 2025-12-05 06:38:49.126 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:38:50 compute-0 nova_compute[186329]: 2025-12-05 06:38:50.964 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:38:53 compute-0 podman[216833]: 2025-12-05 06:38:53.466355306 +0000 UTC m=+0.053389665 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 05 06:38:53 compute-0 podman[216834]: 2025-12-05 06:38:53.475467508 +0000 UTC m=+0.061169793 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 05 06:38:54 compute-0 nova_compute[186329]: 2025-12-05 06:38:54.130 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:38:55 compute-0 nova_compute[186329]: 2025-12-05 06:38:55.966 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:38:59 compute-0 nova_compute[186329]: 2025-12-05 06:38:59.130 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:38:59 compute-0 podman[196599]: time="2025-12-05T06:38:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:38:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:38:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:38:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:38:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2589 "" "Go-http-client/1.1"
Dec 05 06:39:00 compute-0 nova_compute[186329]: 2025-12-05 06:39:00.968 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:39:01 compute-0 openstack_network_exporter[198686]: ERROR   06:39:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:39:01 compute-0 openstack_network_exporter[198686]: ERROR   06:39:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:39:01 compute-0 openstack_network_exporter[198686]: ERROR   06:39:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:39:01 compute-0 openstack_network_exporter[198686]: ERROR   06:39:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:39:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:39:01 compute-0 openstack_network_exporter[198686]: ERROR   06:39:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:39:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:39:02 compute-0 podman[216878]: 2025-12-05 06:39:02.456396358 +0000 UTC m=+0.042142772 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 05 06:39:02 compute-0 podman[216880]: 2025-12-05 06:39:02.473375329 +0000 UTC m=+0.054650166 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Dec 05 06:39:02 compute-0 podman[216879]: 2025-12-05 06:39:02.489864409 +0000 UTC m=+0.073517727 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, container_name=openstack_network_exporter, config_id=edpm, managed_by=edpm_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, name=ubi9-minimal, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 05 06:39:04 compute-0 nova_compute[186329]: 2025-12-05 06:39:04.131 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:39:05 compute-0 nova_compute[186329]: 2025-12-05 06:39:05.971 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:39:06 compute-0 nova_compute[186329]: 2025-12-05 06:39:06.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:39:07 compute-0 nova_compute[186329]: 2025-12-05 06:39:07.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:39:07 compute-0 nova_compute[186329]: 2025-12-05 06:39:07.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:39:07 compute-0 nova_compute[186329]: 2025-12-05 06:39:07.710 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 05 06:39:07 compute-0 nova_compute[186329]: 2025-12-05 06:39:07.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:39:08 compute-0 nova_compute[186329]: 2025-12-05 06:39:08.221 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:39:08 compute-0 nova_compute[186329]: 2025-12-05 06:39:08.221 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:39:08 compute-0 nova_compute[186329]: 2025-12-05 06:39:08.221 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:39:08 compute-0 nova_compute[186329]: 2025-12-05 06:39:08.222 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 05 06:39:08 compute-0 nova_compute[186329]: 2025-12-05 06:39:08.393 186333 WARNING nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:39:08 compute-0 nova_compute[186329]: 2025-12-05 06:39:08.394 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:39:08 compute-0 nova_compute[186329]: 2025-12-05 06:39:08.408 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.014s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:39:08 compute-0 nova_compute[186329]: 2025-12-05 06:39:08.408 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5859MB free_disk=73.16239547729492GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": 
"0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 05 06:39:08 compute-0 nova_compute[186329]: 2025-12-05 06:39:08.409 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:39:08 compute-0 nova_compute[186329]: 2025-12-05 06:39:08.409 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:39:09 compute-0 nova_compute[186329]: 2025-12-05 06:39:09.132 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:39:09 compute-0 nova_compute[186329]: 2025-12-05 06:39:09.947 186333 INFO nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance 83dcc9e9-9fcf-4e34-8b76-598c38ed9bc0 has allocations against this compute host but is not found in the database.
Dec 05 06:39:09 compute-0 nova_compute[186329]: 2025-12-05 06:39:09.948 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 05 06:39:09 compute-0 nova_compute[186329]: 2025-12-05 06:39:09.948 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 06:39:08 up  1:17,  0 user,  load average: 0.10, 0.09, 0.14\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 05 06:39:09 compute-0 nova_compute[186329]: 2025-12-05 06:39:09.975 186333 DEBUG nova.compute.provider_tree [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:39:10 compute-0 nova_compute[186329]: 2025-12-05 06:39:10.479 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:39:10 compute-0 nova_compute[186329]: 2025-12-05 06:39:10.973 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:39:10 compute-0 nova_compute[186329]: 2025-12-05 06:39:10.985 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 05 06:39:10 compute-0 nova_compute[186329]: 2025-12-05 06:39:10.985 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.576s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:39:13 compute-0 nova_compute[186329]: 2025-12-05 06:39:13.980 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:39:13 compute-0 nova_compute[186329]: 2025-12-05 06:39:13.981 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:39:13 compute-0 nova_compute[186329]: 2025-12-05 06:39:13.981 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:39:14 compute-0 nova_compute[186329]: 2025-12-05 06:39:14.135 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:39:15 compute-0 nova_compute[186329]: 2025-12-05 06:39:15.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:39:15 compute-0 nova_compute[186329]: 2025-12-05 06:39:15.976 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:39:18 compute-0 nova_compute[186329]: 2025-12-05 06:39:18.705 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:39:19 compute-0 nova_compute[186329]: 2025-12-05 06:39:19.137 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:39:20 compute-0 nova_compute[186329]: 2025-12-05 06:39:20.978 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:39:24 compute-0 nova_compute[186329]: 2025-12-05 06:39:24.139 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:39:24 compute-0 podman[216933]: 2025-12-05 06:39:24.452803181 +0000 UTC m=+0.034375408 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 06:39:24 compute-0 podman[216932]: 2025-12-05 06:39:24.474450789 +0000 UTC m=+0.056533605 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 05 06:39:25 compute-0 nova_compute[186329]: 2025-12-05 06:39:25.980 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:39:29 compute-0 nova_compute[186329]: 2025-12-05 06:39:29.140 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:39:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:39:29.527 104041 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:39:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:39:29.527 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:39:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:39:29.527 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:39:29 compute-0 podman[196599]: time="2025-12-05T06:39:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:39:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:39:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:39:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:39:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2589 "" "Go-http-client/1.1"
Dec 05 06:39:30 compute-0 nova_compute[186329]: 2025-12-05 06:39:30.982 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:39:31 compute-0 openstack_network_exporter[198686]: ERROR   06:39:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:39:31 compute-0 openstack_network_exporter[198686]: ERROR   06:39:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:39:31 compute-0 openstack_network_exporter[198686]: ERROR   06:39:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:39:31 compute-0 openstack_network_exporter[198686]: ERROR   06:39:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:39:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:39:31 compute-0 openstack_network_exporter[198686]: ERROR   06:39:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:39:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:39:33 compute-0 podman[216979]: 2025-12-05 06:39:33.449909463 +0000 UTC m=+0.035704227 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 05 06:39:33 compute-0 podman[216980]: 2025-12-05 06:39:33.46487542 +0000 UTC m=+0.048061108 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal 
Base Image 9., build-date=2025-08-20T13:12:41, vcs-type=git, com.redhat.component=ubi9-minimal-container, architecture=x86_64, distribution-scope=public, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 05 06:39:33 compute-0 podman[216981]: 2025-12-05 06:39:33.46810332 +0000 UTC m=+0.049395337 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Dec 05 06:39:34 compute-0 nova_compute[186329]: 2025-12-05 06:39:34.142 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:39:35 compute-0 nova_compute[186329]: 2025-12-05 06:39:35.984 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:39:39 compute-0 nova_compute[186329]: 2025-12-05 06:39:39.143 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:39:40 compute-0 nova_compute[186329]: 2025-12-05 06:39:40.986 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:39:44 compute-0 nova_compute[186329]: 2025-12-05 06:39:44.144 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:39:45 compute-0 nova_compute[186329]: 2025-12-05 06:39:45.989 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:39:49 compute-0 nova_compute[186329]: 2025-12-05 06:39:49.145 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:39:50 compute-0 nova_compute[186329]: 2025-12-05 06:39:50.991 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:39:54 compute-0 nova_compute[186329]: 2025-12-05 06:39:54.146 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:39:55 compute-0 podman[217031]: 2025-12-05 06:39:55.453366403 +0000 UTC m=+0.038662078 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 05 06:39:55 compute-0 podman[217030]: 2025-12-05 06:39:55.471448407 +0000 UTC m=+0.057886510 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4)
Dec 05 06:39:55 compute-0 nova_compute[186329]: 2025-12-05 06:39:55.992 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:39:59 compute-0 nova_compute[186329]: 2025-12-05 06:39:59.147 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:39:59 compute-0 podman[196599]: time="2025-12-05T06:39:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:39:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:39:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:39:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:39:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2588 "" "Go-http-client/1.1"
Dec 05 06:40:00 compute-0 nova_compute[186329]: 2025-12-05 06:40:00.994 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:40:01 compute-0 openstack_network_exporter[198686]: ERROR   06:40:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:40:01 compute-0 openstack_network_exporter[198686]: ERROR   06:40:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:40:01 compute-0 openstack_network_exporter[198686]: ERROR   06:40:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:40:01 compute-0 openstack_network_exporter[198686]: ERROR   06:40:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:40:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:40:01 compute-0 openstack_network_exporter[198686]: ERROR   06:40:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:40:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:40:01 compute-0 nova_compute[186329]: 2025-12-05 06:40:01.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:40:04 compute-0 nova_compute[186329]: 2025-12-05 06:40:04.149 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:40:04 compute-0 podman[217075]: 2025-12-05 06:40:04.456556715 +0000 UTC m=+0.042134085 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, 
tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true)
Dec 05 06:40:04 compute-0 podman[217076]: 2025-12-05 06:40:04.459940797 +0000 UTC m=+0.043684289 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7)
Dec 05 06:40:04 compute-0 podman[217077]: 2025-12-05 06:40:04.488999799 +0000 UTC m=+0.070804686 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Dec 05 06:40:05 compute-0 nova_compute[186329]: 2025-12-05 06:40:05.996 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:40:08 compute-0 nova_compute[186329]: 2025-12-05 06:40:08.216 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:40:08 compute-0 nova_compute[186329]: 2025-12-05 06:40:08.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:40:08 compute-0 nova_compute[186329]: 2025-12-05 06:40:08.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:40:09 compute-0 nova_compute[186329]: 2025-12-05 06:40:09.150 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:40:09 compute-0 nova_compute[186329]: 2025-12-05 06:40:09.220 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:40:09 compute-0 nova_compute[186329]: 2025-12-05 06:40:09.221 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:40:09 compute-0 nova_compute[186329]: 2025-12-05 06:40:09.221 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:40:09 compute-0 nova_compute[186329]: 2025-12-05 06:40:09.221 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 05 06:40:09 compute-0 nova_compute[186329]: 2025-12-05 06:40:09.403 186333 WARNING nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:40:09 compute-0 nova_compute[186329]: 2025-12-05 06:40:09.404 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:40:09 compute-0 nova_compute[186329]: 2025-12-05 06:40:09.417 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.013s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:40:09 compute-0 nova_compute[186329]: 2025-12-05 06:40:09.417 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5877MB free_disk=73.16241455078125GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": 
"0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 05 06:40:09 compute-0 nova_compute[186329]: 2025-12-05 06:40:09.418 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:40:09 compute-0 nova_compute[186329]: 2025-12-05 06:40:09.418 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:40:10 compute-0 nova_compute[186329]: 2025-12-05 06:40:10.957 186333 INFO nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance 83dcc9e9-9fcf-4e34-8b76-598c38ed9bc0 has allocations against this compute host but is not found in the database.
Dec 05 06:40:10 compute-0 nova_compute[186329]: 2025-12-05 06:40:10.957 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 05 06:40:10 compute-0 nova_compute[186329]: 2025-12-05 06:40:10.957 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 06:40:09 up  1:18,  0 user,  load average: 0.04, 0.07, 0.13\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 05 06:40:10 compute-0 nova_compute[186329]: 2025-12-05 06:40:10.998 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:40:11 compute-0 nova_compute[186329]: 2025-12-05 06:40:11.017 186333 DEBUG nova.compute.provider_tree [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:40:11 compute-0 nova_compute[186329]: 2025-12-05 06:40:11.521 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:40:12 compute-0 nova_compute[186329]: 2025-12-05 06:40:12.029 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 05 06:40:12 compute-0 nova_compute[186329]: 2025-12-05 06:40:12.030 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.612s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:40:13 compute-0 nova_compute[186329]: 2025-12-05 06:40:13.030 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:40:13 compute-0 nova_compute[186329]: 2025-12-05 06:40:13.031 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:40:13 compute-0 nova_compute[186329]: 2025-12-05 06:40:13.031 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 05 06:40:14 compute-0 nova_compute[186329]: 2025-12-05 06:40:14.152 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:40:14 compute-0 nova_compute[186329]: 2025-12-05 06:40:14.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:40:14 compute-0 nova_compute[186329]: 2025-12-05 06:40:14.710 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:40:16 compute-0 nova_compute[186329]: 2025-12-05 06:40:16.000 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:40:16 compute-0 nova_compute[186329]: 2025-12-05 06:40:16.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:40:19 compute-0 nova_compute[186329]: 2025-12-05 06:40:19.154 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:40:20 compute-0 nova_compute[186329]: 2025-12-05 06:40:20.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:40:20 compute-0 nova_compute[186329]: 2025-12-05 06:40:20.709 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11913
Dec 05 06:40:21 compute-0 nova_compute[186329]: 2025-12-05 06:40:21.003 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:40:21 compute-0 nova_compute[186329]: 2025-12-05 06:40:21.214 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11922
Dec 05 06:40:24 compute-0 nova_compute[186329]: 2025-12-05 06:40:24.156 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:40:26 compute-0 nova_compute[186329]: 2025-12-05 06:40:26.005 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:40:26 compute-0 podman[217130]: 2025-12-05 06:40:26.453559479 +0000 UTC m=+0.034738319 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 06:40:26 compute-0 podman[217129]: 2025-12-05 06:40:26.482524846 +0000 UTC m=+0.065053825 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true)
Dec 05 06:40:29 compute-0 nova_compute[186329]: 2025-12-05 06:40:29.157 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:40:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:40:29.528 104041 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:40:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:40:29.529 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:40:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:40:29.529 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:40:29 compute-0 podman[196599]: time="2025-12-05T06:40:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:40:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:40:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:40:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:40:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2586 "" "Go-http-client/1.1"
Dec 05 06:40:31 compute-0 nova_compute[186329]: 2025-12-05 06:40:31.007 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:40:31 compute-0 openstack_network_exporter[198686]: ERROR   06:40:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:40:31 compute-0 openstack_network_exporter[198686]: ERROR   06:40:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:40:31 compute-0 openstack_network_exporter[198686]: ERROR   06:40:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:40:31 compute-0 openstack_network_exporter[198686]: ERROR   06:40:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:40:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:40:31 compute-0 openstack_network_exporter[198686]: ERROR   06:40:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:40:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:40:34 compute-0 nova_compute[186329]: 2025-12-05 06:40:34.158 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:40:34 compute-0 nova_compute[186329]: 2025-12-05 06:40:34.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:40:34 compute-0 nova_compute[186329]: 2025-12-05 06:40:34.710 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11951
Dec 05 06:40:35 compute-0 podman[217174]: 2025-12-05 06:40:35.458519528 +0000 UTC m=+0.044136048 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, 
org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Dec 05 06:40:35 compute-0 podman[217175]: 2025-12-05 06:40:35.458753378 +0000 UTC m=+0.042709137 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, name=ubi9-minimal, architecture=x86_64, config_id=edpm, build-date=2025-08-20T13:12:41, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, version=9.6, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 05 06:40:35 compute-0 podman[217176]: 2025-12-05 06:40:35.488418248 +0000 UTC m=+0.070454147 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 05 06:40:36 compute-0 nova_compute[186329]: 2025-12-05 06:40:36.009 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:40:39 compute-0 nova_compute[186329]: 2025-12-05 06:40:39.159 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:40:41 compute-0 nova_compute[186329]: 2025-12-05 06:40:41.011 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:40:44 compute-0 nova_compute[186329]: 2025-12-05 06:40:44.161 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:40:46 compute-0 nova_compute[186329]: 2025-12-05 06:40:46.013 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:40:49 compute-0 nova_compute[186329]: 2025-12-05 06:40:49.163 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:40:51 compute-0 nova_compute[186329]: 2025-12-05 06:40:51.016 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:40:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:40:53.008 104041 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '46:84:d1', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': 'a6:99:88:db:d9:a2'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Dec 05 06:40:53 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:40:53.009 104041 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Dec 05 06:40:53 compute-0 nova_compute[186329]: 2025-12-05 06:40:53.009 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:40:54 compute-0 nova_compute[186329]: 2025-12-05 06:40:54.164 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:40:54 compute-0 nova_compute[186329]: 2025-12-05 06:40:54.371 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:40:56 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:40:56.010 104041 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=89d40815-76f5-4f1d-9077-84d831b7d6c4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 05 06:40:56 compute-0 nova_compute[186329]: 2025-12-05 06:40:56.017 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:40:57 compute-0 podman[217226]: 2025-12-05 06:40:57.469380255 +0000 UTC m=+0.056133253 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Dec 05 06:40:57 compute-0 podman[217227]: 2025-12-05 06:40:57.479493098 +0000 UTC m=+0.065269499 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 05 06:40:59 compute-0 nova_compute[186329]: 2025-12-05 06:40:59.166 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:40:59 compute-0 podman[196599]: time="2025-12-05T06:40:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:40:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:40:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:40:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:40:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2587 "" "Go-http-client/1.1"
Dec 05 06:41:01 compute-0 nova_compute[186329]: 2025-12-05 06:41:01.018 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:41:01 compute-0 openstack_network_exporter[198686]: ERROR   06:41:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:41:01 compute-0 openstack_network_exporter[198686]: ERROR   06:41:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:41:01 compute-0 openstack_network_exporter[198686]: ERROR   06:41:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:41:01 compute-0 openstack_network_exporter[198686]: ERROR   06:41:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:41:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:41:01 compute-0 openstack_network_exporter[198686]: ERROR   06:41:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:41:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:41:04 compute-0 nova_compute[186329]: 2025-12-05 06:41:04.167 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:41:06 compute-0 nova_compute[186329]: 2025-12-05 06:41:06.021 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:41:06 compute-0 podman[217270]: 2025-12-05 06:41:06.458790493 +0000 UTC m=+0.041893132 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, 
tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, io.buildah.version=1.41.4, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 05 06:41:06 compute-0 podman[217272]: 2025-12-05 06:41:06.462397935 +0000 UTC m=+0.040751685 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 05 06:41:06 compute-0 podman[217271]: 2025-12-05 06:41:06.493461835 +0000 UTC m=+0.074544195 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, release=1755695350, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 05 06:41:08 compute-0 nova_compute[186329]: 2025-12-05 06:41:08.215 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:41:08 compute-0 nova_compute[186329]: 2025-12-05 06:41:08.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:41:09 compute-0 nova_compute[186329]: 2025-12-05 06:41:09.170 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:41:09 compute-0 nova_compute[186329]: 2025-12-05 06:41:09.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:41:10 compute-0 nova_compute[186329]: 2025-12-05 06:41:10.226 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:41:10 compute-0 nova_compute[186329]: 2025-12-05 06:41:10.227 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:41:10 compute-0 nova_compute[186329]: 2025-12-05 06:41:10.227 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:41:10 compute-0 nova_compute[186329]: 2025-12-05 06:41:10.227 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 05 06:41:10 compute-0 nova_compute[186329]: 2025-12-05 06:41:10.406 186333 WARNING nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:41:10 compute-0 nova_compute[186329]: 2025-12-05 06:41:10.407 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:41:10 compute-0 nova_compute[186329]: 2025-12-05 06:41:10.420 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.013s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:41:10 compute-0 nova_compute[186329]: 2025-12-05 06:41:10.420 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5890MB free_disk=73.16239547729492GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": 
"label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": 
"0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 05 06:41:10 compute-0 nova_compute[186329]: 2025-12-05 06:41:10.420 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:41:10 compute-0 nova_compute[186329]: 2025-12-05 06:41:10.421 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:41:11 compute-0 nova_compute[186329]: 2025-12-05 06:41:11.024 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:41:12 compute-0 nova_compute[186329]: 2025-12-05 06:41:12.125 186333 INFO nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance 83dcc9e9-9fcf-4e34-8b76-598c38ed9bc0 has allocations against this compute host but is not found in the database.
Dec 05 06:41:12 compute-0 nova_compute[186329]: 2025-12-05 06:41:12.126 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 05 06:41:12 compute-0 nova_compute[186329]: 2025-12-05 06:41:12.126 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 06:41:10 up  1:19,  0 user,  load average: 0.15, 0.09, 0.13\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 05 06:41:12 compute-0 nova_compute[186329]: 2025-12-05 06:41:12.236 186333 DEBUG nova.compute.provider_tree [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:41:12 compute-0 nova_compute[186329]: 2025-12-05 06:41:12.742 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:41:13 compute-0 nova_compute[186329]: 2025-12-05 06:41:13.249 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 05 06:41:13 compute-0 nova_compute[186329]: 2025-12-05 06:41:13.249 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.828s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:41:14 compute-0 nova_compute[186329]: 2025-12-05 06:41:14.170 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:41:14 compute-0 nova_compute[186329]: 2025-12-05 06:41:14.249 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:41:14 compute-0 nova_compute[186329]: 2025-12-05 06:41:14.249 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:41:14 compute-0 nova_compute[186329]: 2025-12-05 06:41:14.249 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 05 06:41:14 compute-0 nova_compute[186329]: 2025-12-05 06:41:14.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:41:14 compute-0 nova_compute[186329]: 2025-12-05 06:41:14.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:41:16 compute-0 nova_compute[186329]: 2025-12-05 06:41:16.026 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:41:17 compute-0 nova_compute[186329]: 2025-12-05 06:41:17.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:41:19 compute-0 nova_compute[186329]: 2025-12-05 06:41:19.172 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:41:21 compute-0 nova_compute[186329]: 2025-12-05 06:41:21.028 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:41:22 compute-0 nova_compute[186329]: 2025-12-05 06:41:22.704 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:41:24 compute-0 nova_compute[186329]: 2025-12-05 06:41:24.173 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:41:26 compute-0 nova_compute[186329]: 2025-12-05 06:41:26.030 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:41:26 compute-0 ovn_controller[95223]: 2025-12-05T06:41:26Z|00244|memory_trim|INFO|Detected inactivity (last active 30026 ms ago): trimming memory
Dec 05 06:41:28 compute-0 podman[217322]: 2025-12-05 06:41:28.450379079 +0000 UTC m=+0.035732630 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 05 06:41:28 compute-0 podman[217321]: 2025-12-05 06:41:28.493216525 +0000 UTC m=+0.079705940 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4)
Dec 05 06:41:29 compute-0 nova_compute[186329]: 2025-12-05 06:41:29.174 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:41:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:41:29.530 104041 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:41:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:41:29.530 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:41:29 compute-0 ovn_metadata_agent[104036]: 2025-12-05 06:41:29.530 104041 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:41:29 compute-0 podman[196599]: time="2025-12-05T06:41:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:41:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:41:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:41:29 compute-0 podman[196599]: @ - - [05/Dec/2025:06:41:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2586 "" "Go-http-client/1.1"
Dec 05 06:41:31 compute-0 nova_compute[186329]: 2025-12-05 06:41:31.032 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:41:31 compute-0 openstack_network_exporter[198686]: ERROR   06:41:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:41:31 compute-0 openstack_network_exporter[198686]: ERROR   06:41:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:41:31 compute-0 openstack_network_exporter[198686]: ERROR   06:41:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:41:31 compute-0 openstack_network_exporter[198686]: ERROR   06:41:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:41:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:41:31 compute-0 openstack_network_exporter[198686]: ERROR   06:41:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:41:31 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:41:34 compute-0 nova_compute[186329]: 2025-12-05 06:41:34.176 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:41:36 compute-0 nova_compute[186329]: 2025-12-05 06:41:36.034 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:41:37 compute-0 podman[217367]: 2025-12-05 06:41:37.475373941 +0000 UTC m=+0.058586736 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.tags=minimal rhel9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal 
Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, release=1755695350, version=9.6)
Dec 05 06:41:37 compute-0 podman[217368]: 2025-12-05 06:41:37.491495681 +0000 UTC m=+0.073739013 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2)
Dec 05 06:41:37 compute-0 podman[217366]: 2025-12-05 06:41:37.491553501 +0000 UTC m=+0.076975678 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 05 06:41:39 compute-0 nova_compute[186329]: 2025-12-05 06:41:39.178 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:41:41 compute-0 nova_compute[186329]: 2025-12-05 06:41:41.036 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:41:44 compute-0 nova_compute[186329]: 2025-12-05 06:41:44.179 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:41:46 compute-0 nova_compute[186329]: 2025-12-05 06:41:46.038 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:41:49 compute-0 nova_compute[186329]: 2025-12-05 06:41:49.180 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:41:51 compute-0 nova_compute[186329]: 2025-12-05 06:41:51.040 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:41:54 compute-0 nova_compute[186329]: 2025-12-05 06:41:54.182 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:41:56 compute-0 nova_compute[186329]: 2025-12-05 06:41:56.042 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:41:59 compute-0 nova_compute[186329]: 2025-12-05 06:41:59.182 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:41:59 compute-0 sshd-session[217419]: Accepted publickey for zuul from 192.168.122.10 port 43652 ssh2: ECDSA SHA256:qs1gnk6DlsWBMwOXH28ouwL9ltr8rmS6SqK96Fn6Lw8
Dec 05 06:41:59 compute-0 systemd-logind[745]: New session 32 of user zuul.
Dec 05 06:41:59 compute-0 systemd[1]: Started Session 32 of User zuul.
Dec 05 06:41:59 compute-0 sshd-session[217419]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 05 06:41:59 compute-0 podman[217423]: 2025-12-05 06:41:59.334723342 +0000 UTC m=+0.039372793 container health_status f782184f7575011fc41381dbb7f97bc0c8712f26f940af8b7906c8021db9aaca (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 05 06:41:59 compute-0 podman[217421]: 2025-12-05 06:41:59.355114298 +0000 UTC m=+0.061713946 container health_status 9098af1895e92b8f667ebe9fa34226c436b33e3499b6f4af457edd8e635fe2b6 (image=quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ovn-controller:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0)
Dec 05 06:41:59 compute-0 sudo[217464]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Dec 05 06:41:59 compute-0 sudo[217464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 05 06:41:59 compute-0 podman[196599]: time="2025-12-05T06:41:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 05 06:41:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:41:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17354 "" "Go-http-client/1.1"
Dec 05 06:41:59 compute-0 podman[196599]: @ - - [05/Dec/2025:06:41:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2588 "" "Go-http-client/1.1"
Dec 05 06:42:01 compute-0 nova_compute[186329]: 2025-12-05 06:42:01.043 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:42:01 compute-0 openstack_network_exporter[198686]: ERROR   06:42:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 05 06:42:01 compute-0 openstack_network_exporter[198686]: ERROR   06:42:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:42:01 compute-0 openstack_network_exporter[198686]: ERROR   06:42:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 05 06:42:01 compute-0 openstack_network_exporter[198686]: ERROR   06:42:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 05 06:42:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:42:01 compute-0 openstack_network_exporter[198686]: ERROR   06:42:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 05 06:42:01 compute-0 openstack_network_exporter[198686]: 
Dec 05 06:42:04 compute-0 nova_compute[186329]: 2025-12-05 06:42:04.183 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:42:06 compute-0 nova_compute[186329]: 2025-12-05 06:42:06.044 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:42:06 compute-0 ovs-vsctl[217666]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Dec 05 06:42:07 compute-0 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 217491 (sos)
Dec 05 06:42:07 compute-0 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Dec 05 06:42:07 compute-0 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Dec 05 06:42:07 compute-0 virtqemud[186605]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Dec 05 06:42:07 compute-0 virtqemud[186605]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Dec 05 06:42:07 compute-0 virtqemud[186605]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec 05 06:42:07 compute-0 podman[217874]: 2025-12-05 06:42:07.984909834 +0000 UTC m=+0.084010204 container health_status 836aa6ff54ca414f87d2c6cc7daa63cd346cd21c757ec06fc273563b3bc85ad0 (image=quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-multipathd:current', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Dec 05 06:42:08 compute-0 podman[217866]: 2025-12-05 06:42:08.019327639 +0000 UTC m=+0.125512618 container health_status 09687ae2b7a2d58c700e5693e23e835bc9ed279c3012b0ff385310b9087bdb22 (image=quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.rdoproject.org/podified-master-centos10/openstack-neutron-metadata-agent-ovn:current', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=de09a328a4b55368e6f8fcc47010b7ec, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 05 06:42:08 compute-0 podman[217870]: 2025-12-05 06:42:08.021488773 +0000 UTC m=+0.130491820 container health_status 186362a602145f9b738120e6795f8d20d6998ef1c17c76f65275eb5b434a4454 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm)
Dec 05 06:42:08 compute-0 crontab[218107]: (root) LIST (root)
Dec 05 06:42:08 compute-0 nova_compute[186329]: 2025-12-05 06:42:08.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:42:09 compute-0 nova_compute[186329]: 2025-12-05 06:42:09.184 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:42:09 compute-0 nova_compute[186329]: 2025-12-05 06:42:09.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:42:10 compute-0 nova_compute[186329]: 2025-12-05 06:42:10.223 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:42:10 compute-0 nova_compute[186329]: 2025-12-05 06:42:10.224 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:42:10 compute-0 nova_compute[186329]: 2025-12-05 06:42:10.224 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:42:10 compute-0 nova_compute[186329]: 2025-12-05 06:42:10.224 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Dec 05 06:42:10 compute-0 nova_compute[186329]: 2025-12-05 06:42:10.420 186333 WARNING nova.virt.libvirt.driver [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 05 06:42:10 compute-0 nova_compute[186329]: 2025-12-05 06:42:10.421 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Dec 05 06:42:10 compute-0 nova_compute[186329]: 2025-12-05 06:42:10.437 186333 DEBUG oslo_concurrency.processutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Dec 05 06:42:10 compute-0 nova_compute[186329]: 2025-12-05 06:42:10.437 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5733MB free_disk=73.04262161254883GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_1f_2", "address": "0000:00:1f.2", "product_id": "2922", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2922", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_0", "address": "0000:00:1f.0", "product_id": "2918", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2918", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_3", "address": "0000:00:03.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_7", "address": "0000:00:03.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_6", "address": "0000:00:03.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_01_00_0", "address": "0000:01:00.0", "product_id": "000e", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000e", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_5", "address": "0000:00:02.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_1", "address": "0000:00:02.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_7", "address": "0000:00:02.7", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_3", "address": "0000:00:02.3", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_2", "address": "0000:00:03.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_1f_3", "address": "0000:00:1f.3", "product_id": "2930", "vendor_id": "8086", "numa_node": null, "label": "label_8086_2930", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_07_00_0", "address": "0000:07:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_1", "address": "0000:00:03.1", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "1041", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1041", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_05_00_0", "address": "0000:05:00.0", "product_id": "1045", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1045", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_04_00_0", "address": "0000:04:00.0", "product_id": "1042", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1042", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "29c0", "vendor_id": "8086", "numa_node": null, "label": "label_8086_29c0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_4", "address": "0000:00:03.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_6", "address": "0000:00:02.6", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_06_00_0", "address": "0000:06:00.0", "product_id": "1044", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1044", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_2", "address": "0000:00:02.2", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_5", "address": "0000:00:03.5", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_4", "address": "0000:00:02.4", "product_id": "000c", "vendor_id": "1b36", "numa_node": null, "label": "label_1b36_000c", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Dec 05 06:42:10 compute-0 nova_compute[186329]: 2025-12-05 06:42:10.438 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Dec 05 06:42:10 compute-0 nova_compute[186329]: 2025-12-05 06:42:10.438 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Dec 05 06:42:10 compute-0 systemd[1]: Starting Hostname Service...
Dec 05 06:42:10 compute-0 systemd[1]: Started Hostname Service.
Dec 05 06:42:11 compute-0 nova_compute[186329]: 2025-12-05 06:42:11.045 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:42:11 compute-0 nova_compute[186329]: 2025-12-05 06:42:11.997 186333 INFO nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Instance 83dcc9e9-9fcf-4e34-8b76-598c38ed9bc0 has allocations against this compute host but is not found in the database.
Dec 05 06:42:11 compute-0 nova_compute[186329]: 2025-12-05 06:42:11.997 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Total usable vcpus: 4, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Dec 05 06:42:11 compute-0 nova_compute[186329]: 2025-12-05 06:42:11.997 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7681MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=4 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 06:42:10 up  1:20,  0 user,  load average: 0.52, 0.21, 0.17\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Dec 05 06:42:12 compute-0 nova_compute[186329]: 2025-12-05 06:42:12.063 186333 DEBUG nova.compute.provider_tree [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed in ProviderTree for provider: f2df025e-56e9-4920-9fad-1a12202c4aeb update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Dec 05 06:42:12 compute-0 nova_compute[186329]: 2025-12-05 06:42:12.568 186333 DEBUG nova.scheduler.client.report [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Inventory has not changed for provider f2df025e-56e9-4920-9fad-1a12202c4aeb based on inventory data: {'VCPU': {'total': 4, 'reserved': 0, 'min_unit': 1, 'max_unit': 4, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7681, 'reserved': 512, 'min_unit': 1, 'max_unit': 7681, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Dec 05 06:42:13 compute-0 nova_compute[186329]: 2025-12-05 06:42:13.077 186333 DEBUG nova.compute.resource_tracker [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Dec 05 06:42:13 compute-0 nova_compute[186329]: 2025-12-05 06:42:13.077 186333 DEBUG oslo_concurrency.lockutils [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.639s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Dec 05 06:42:14 compute-0 nova_compute[186329]: 2025-12-05 06:42:14.077 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:42:14 compute-0 nova_compute[186329]: 2025-12-05 06:42:14.078 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:42:14 compute-0 nova_compute[186329]: 2025-12-05 06:42:14.078 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Dec 05 06:42:14 compute-0 nova_compute[186329]: 2025-12-05 06:42:14.078 186333 DEBUG nova.compute.manager [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11232
Dec 05 06:42:14 compute-0 nova_compute[186329]: 2025-12-05 06:42:14.186 186333 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Dec 05 06:42:14 compute-0 nova_compute[186329]: 2025-12-05 06:42:14.709 186333 DEBUG oslo_service.periodic_task [None req-deac8098-c9e0-4b6e-ac7c-2fa094b7dad7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
